Table of Contents

  • What Is Robotics?
  • Types of Robots
  • Advantages and Disadvantages of Robots
  • The Future of Robotics: What’s the Use of AI in Robotics?
  • A Word About Robot Software
  • The Future of Robotics and Robots
  • The Future of Robotics: How Robots Will Change the World
  • Choose the Right Program
  • How to Get Started in Robotics
  • The Future of Robotics: How Robots Will Transform Our Lives

The Future of Robotics: How Robots Will Transform Our Lives

What comes to mind when you hear the word “robot”? Do you picture a metallic humanoid in a spaceship in the distant future? Perhaps you imagine a dystopian future where humanity is enslaved by its robot overlords. Or maybe you think of an automobile assembly line with robot-like machines putting cars together.

Whatever you think, one thing is sure: robots are here to stay. Fortunately, it seems likely that robots will be more about doing repetitive or dangerous tasks than seizing supreme executive power. Let’s look at robotics, defining and classifying the term, figuring out the role of Artificial Intelligence in the field, the future of robotics, and how robotics will change our lives.

Robotics is the engineering branch that deals with the conception, design, construction, operation, and application of robots. Digging a little deeper, we see that a robot is defined as an automatically operated machine that carries out a series of actions independently and does work usually accomplished by a human.

Incidentally, robots don’t have to resemble humans, although some do; look at images of automobile assembly lines for proof. Robots that appear human are typically referred to as “androids.” Designers often make their creations look human so that people feel more at ease around them, but the effect isn’t guaranteed: some people find robots, especially ones that resemble people, creepy.

Robots are versatile machines, evidenced by their wide variety of forms and functions. Here's a list of a few kinds of robots we see today:

  • Healthcare: Robots in the healthcare industry do everything from assisting in surgery to guiding physical therapy that helps people walk to moving through hospitals delivering essential supplies such as medications or linens. Healthcare robots even contributed to the fight against the COVID-19 pandemic, filling and sealing testing swabs and producing respirators.
  • Homelife: You need look no further than a Roomba to find a robot in someone's house. But home robots do more now than vacuum floors; they can mow lawns or augment voice assistants like Alexa.
  • Manufacturing: The field of manufacturing was the first to adopt robots, such as the automobile assembly line machines we previously mentioned. Industrial robots handle various tasks like arc welding, material handling, steel cutting, and food packaging.
  • Logistics: Everybody wants their online orders delivered on time, if not sooner. So companies employ robots to stack warehouse shelves, retrieve goods, and even conduct short-range deliveries.
  • Space Exploration: Mars explorers such as Sojourner and Perseverance are robots. The Hubble telescope is classified as a robot, as are deep space probes like Voyager and Cassini.
  • Military: Robots handle dangerous tasks, and it doesn't get any more dangerous than modern warfare. Consequently, the military fields a diverse selection of robots equipped to handle many of the riskier jobs associated with war. For example, there's the Centaur, an explosive-detection and disposal robot that looks for mines and IEDs; the MUTT, which follows soldiers around and totes their gear; and SAFFiR, which fights fires that break out on naval vessels.
  • Entertainment: We already have toy robots, robot statues, and robot restaurants. As robots become more sophisticated, expect their entertainment value to rise accordingly.
  • Travel: We only need to say three words: self-driving vehicles.


Like any innovation today, robots have their pluses and minuses. Here’s a breakdown of the good and bad about robots and the future of robotics.

Advantages

  • They work in hazardous environments: Why risk human lives when you can send a robot in to do the job? Consider how preferable it is to have a robot fighting a fire or working on a nuclear reactor core.
  • They’re cost-effective: Robots don’t take sick days or coffee breaks, nor need perks like life insurance, paid time off, or healthcare offerings like dental and vision.
  • They increase productivity: Robots are wired to perform repetitive tasks ad infinitum; the human brain is not. Industries use robots to accomplish the tedious, redundant work, freeing employees to tackle more challenging tasks and even learn new skills.
  • They offer better quality assurance: Vigilance decrement is a lapse in concentration that hits workers who repeatedly perform the same functions. As the human’s concentration level drops, the likelihood of errors, poor results, or even accidents increases. Robots perform repetitive tasks flawlessly without having their performance slip due to boredom.

Disadvantages

  • They incur steep startup costs: Robot implementation is an investment risk, and it costs a lot. Although most manufacturers eventually recoup their investment over the long run, it's expensive in the short term. However, this is a common obstacle in adopting any new technology, like setting up a wireless network or performing a cloud migration.
  • They might take away jobs: Yes, some people have been replaced by robots in certain situations, like assembly lines, for instance. Whenever the business sector incorporates game-changing technology, some jobs become casualties. However, this disadvantage might be overstated because robot implementation typically creates a greater demand for people to support the technology, which brings up the final disadvantage.
  • They require companies to hire skilled support staff: This drawback is good news for potential employees, but bad news for thrifty-minded companies. Robots require programmers, operators, and repair personnel. While job seekers may rejoice, the prospect of recruiting professionals (and paying professional-level salaries!) may serve as an impediment to implementing robots.

Artificial Intelligence (AI) increases human-robot interaction, collaboration opportunities, and quality. The industrial sector already has co-bots, which are robots that work alongside humans to perform testing and assembly.

Advances in AI help robots mimic human behavior more closely, which is why they were created in the first place. Robots that act and think more like people can integrate better into the workforce and bring a level of efficiency unmatched by human employees.

Robot designers use Artificial Intelligence to give their creations enhanced capabilities like:

  • Computer Vision: Robots can identify and recognize objects they encounter, discern details, and learn how to navigate toward or avoid specific items.
  • Manipulation: AI helps robots gain the fine motor skills needed to grasp objects without destroying the item.
  • Motion Control and Navigation: Robots no longer need humans to guide them along paths and process flows. AI enables robots to analyze their environment and self-navigate. This capability even applies to the virtual world of software. AI helps robot software processes avoid flow bottlenecks or process exceptions.
  • Natural Language Processing (NLP) and Real-World Perception: Artificial Intelligence and Machine Learning (ML) help robots better understand their surroundings, recognize and identify patterns, and comprehend data. These improvements increase the robot’s autonomy and decrease reliance on human agents.
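To make the motion control and navigation point concrete, here is a minimal self-navigation sketch: breadth-first search over a toy occupancy grid. The grid, coordinates, and function name are invented for illustration; real robots plan over maps built from sensor data, but the core idea of searching free space for a route is the same.

```python
from collections import deque

def plan_path(grid, start, goal):
    """Breadth-first search over a 2-D occupancy grid.
    grid: list of strings where '#' is an obstacle and '.' is free space.
    Returns the shortest list of (row, col) cells from start to goal,
    or None if the goal is unreachable."""
    rows, cols = len(grid), len(grid[0])
    frontier = deque([start])
    came_from = {start: None}          # also serves as the visited set
    while frontier:
        cell = frontier.popleft()
        if cell == goal:
            path = []                  # walk predecessors back to start
            while cell is not None:
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        r, c = cell
        for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nr, nc = nxt
            if 0 <= nr < rows and 0 <= nc < cols \
                    and grid[nr][nc] != '#' and nxt not in came_from:
                came_from[nxt] = cell
                frontier.append(nxt)
    return None

# A small room with a wall segment in the middle.
room = ["....",
        ".##.",
        "...."]
path = plan_path(room, (0, 0), (2, 3))  # 6 cells: 5 moves around the wall
```

Because breadth-first search explores cells in order of distance from the start, the first route it finds around the wall is also a shortest one.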

Software robots are computer programs that perform tasks without human intervention, such as web crawlers or chatbots. These robots are entirely virtual and not considered actual robots since they have no physical characteristics.

This technology shouldn't be confused with robotic software, which is loaded into a physical robot and determines its behavior. However, some overlap between the two is natural since, in both cases, the software helps the entity (robot or computer program) perform its functions independently of human interaction.

Thanks to improved sensor technology and remarkable advances in Machine Learning and Artificial Intelligence, robots will keep evolving from mere rote machines into collaborators with cognitive functions. These and other associated fields are enjoying an upward trajectory, and robotics will benefit significantly from these strides.

We can expect to see greater numbers of increasingly sophisticated robots incorporated into more areas of life, working alongside humans. Contrary to dystopian-minded prophets of doom, these improved robots will not simply replace workers. Industries rise and fall, and some become obsolete in the face of new technologies, but new technologies also bring new opportunities for employment and education.

That’s the case with robots. Perhaps there will be fewer human workers welding automobile frames, but there will be a greater need for skilled technicians to program, maintain, and repair the machines. In many cases, employees could receive valuable in-house training and upskilling, gaining a set of skills that applies to robot programming and maintenance as well as to other fields and industries.

Robots will increase economic growth and productivity and create new career opportunities for many people worldwide. However, there are still warnings out there about massive job losses, forecasting losses of 20 million manufacturing jobs by 2030, or predicting that 30% of all jobs could be automated by 2030.

But thanks to the consistent levels of precision that robots offer, we can look forward to robots handling more of the burdensome, redundant manual labor tasks, making transportation work more efficiently, improving healthcare, and freeing people to improve themselves. But, of course, time will tell how this all works out.

Supercharge your career in AI and ML with Simplilearn's comprehensive courses. Gain the skills and knowledge to transform industries and unleash your true potential. Enroll now and unlock limitless possibilities!


If you want to become part of the robot revolution (revolutionizing how we live and work, not an actual overthrow of humanity), Simplilearn has what you need to get started. The AI and Machine Learning Bootcamp, delivered in partnership with IBM and Caltech, covers vital robot-related concepts such as statistics, data science with Python, Machine Learning, deep learning, NLP, and reinforcement learning.

The bootcamp covers the latest tools and technologies from the AI ecosystem, featuring masterclasses by Caltech instructors and IBM experts, including hackathons and Ask Me Anything sessions conducted by IBM.

According to ZipRecruiter, AI Engineers in the US can earn a yearly average of $164,769, and Glassdoor reports that similar positions in India pay an annual average of ₹949,364.

Visit Simplilearn today and start an exciting new career with a fantastic future!



We ask the experts: will robots take over the world?


Robots can do a lot for us: they can explore space or they can cut our toenails. But do advances in robotics and artificial intelligence hold hidden threats? Three leaders in their fields answer questions about our relationships with robots.

“By the end of the century, the entire solar system -- planets, moons and asteroids -- will be explored and mapped by flotillas of tiny robotic craft.” (Martin Rees)

The origins of robotics go back to the automata invented by ancient civilisations. The word robot entered our vocabulary only in 1920, with Czech writer Karel Čapek’s play R.U.R (Rossum’s Universal Robots). Over the past 20 years, robots have been developed to work in settings that range from manufacturing industry to space. At Cambridge University, robotics is a rapidly developing field within many departments, from theoretical physics and computing to engineering and medical science.

Lord Martin Rees is Emeritus Professor of Cosmology and Astrophysics at the University of Cambridge and holds the honorary title of Astronomer Royal. He is co-founder of the Centre for the Study of Existential Risk, an early-stage initiative that brings together a scientist, a philosopher and a software entrepreneur.

Kathleen Richardson is an anthropologist of robots. She took her PhD at Cambridge and recently completed a postdoctoral fellowship at UCL. She is writing a book that explores the representational models used by scientists and how they influence the ideas we have about robots as potential friends or enemies.

Daniel Wolpert is a Royal Society Research Professor in the Department of Engineering, Cambridge University. His expertise lies in bioengineering, especially the mechanisms that control interactions between brain and body. The focus of his research group is an understanding of movement, which he believes is central to all human activities.


This work is licensed under a Creative Commons Licence . If you use this content on your site please link back to this page.

© 2024 University of Cambridge


Robotics: What Are Robots? Robotics Definition & Uses.

Robotics Definition

Robotics is an interdisciplinary sector of science and engineering dedicated to the design, construction and use of mechanical robots. Our guide will give you a concrete grasp of robotics, including different types of robots and how they’re being applied across industries.


What Is Robotics?

Robotics is the intersection of science, engineering and technology that produces machines, called robots, that replicate or substitute for human actions. Robots perform basic and repetitive tasks with greater efficiency and accuracy than humans, making them ideal for industries like manufacturing. However, the introduction of artificial intelligence in robotics has given robots the ability to handle increasingly complex situations in various industries.

What Is a Robot?

A robot is a programmable machine that can complete a task, while the term robotics describes the field of study focused on developing robots and automation. Each robot has a different level of autonomy, ranging from human-controlled bots that carry out tasks under direct command to fully autonomous bots that perform tasks without any external influence.

In terms of etymology, the word ‘robot’ is derived from the Czech word robota, which means “forced labor.” The word first appeared in the 1920 play R.U.R., in reference to the play’s characters, who were mass-produced workers incapable of creative thinking.

Robotics Aspects

Mechanical Construction

The mechanical aspect of a robot helps it complete tasks in the environment for which it’s designed. For example, the Mars 2020 Rover’s wheels are individually motorized and made of titanium tubing that help it firmly grip the harsh terrain of the red planet.

Electrical Components

Robots need electrical components that control and power the machinery. Essentially, an electric current — a battery, for example — is needed to power a large majority of robots.

Software Program

Robots contain at least some level of computer programming. Without a set of code telling it what to do, a robot would just be another piece of simple machinery. Inserting a program into a robot gives it the ability to know when and how to carry out a task.

What Are the Main Components of a Robot?

Control System

Computation includes all of the components that make up a robot’s central processing unit, often referred to as its control system. Control systems are programmed to tell a robot how to utilize its specific components, similar in some ways to how the human brain sends signals throughout the body, in order to complete a specific task. These robotic tasks could comprise anything from minimally invasive surgery to assembly line packing.

Sensors

Sensors provide a robot with stimuli in the form of electrical signals that are processed by the controller and allow the robot to interact with the outside world. Common sensors found within robots include video cameras that function as eyes, photoresistors that react to light, and microphones that operate like ears. These sensors allow the robot to capture its surroundings, draw the most logical conclusion from the current moment, and let the controller relay commands to the other components.

Actuators

A device can only be considered a robot if it has a movable frame or body. Actuators are the components responsible for this movement. They consist of motors that receive signals from the control system and move in tandem to carry out the assigned task. Actuators can be made of a variety of materials, such as metal or elastic, and are commonly operated by compressed air (pneumatic actuators) or oil (hydraulic actuators), but they come in a variety of formats to best fulfill their specialized roles.
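Tying these components together: the control system reads a sensor value, computes a command, and the actuators execute it. A minimal sketch of that loop is a proportional controller; the function name, values, and units below are made up for illustration.

```python
def p_controller(setpoint, reading, gain=0.5):
    """Proportional control: the actuator command scales with the error
    between the desired value (setpoint) and the sensor reading."""
    return gain * (setpoint - reading)

# Simulated loop: a robot regulates its distance to a wall.
distance = 10.0   # current sensor reading, in (hypothetical) cm
setpoint = 25.0   # desired distance from the wall
for _ in range(50):
    command = p_controller(setpoint, distance)  # control system decides
    distance += command                         # actuators move the robot
# distance has now converged to the 25 cm setpoint
```

With a gain between 0 and 1 the error shrinks on every cycle, so the robot settles smoothly at the setpoint; too high a gain would make it overshoot and oscillate, which is why real controllers add integral and derivative terms.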

Power Supply

Like the human body requires food in order to function, robots require power. Stationary robots, such as those found in a factory, may run on AC power through a wall outlet, but more commonly robots operate via an internal battery. Most robots use lead-acid batteries for their safety and long shelf life, while others use the more compact but more expensive silver-cadmium variety. Safety, weight, replaceability, and lifecycle are all important factors to consider when designing a robot’s power supply.

Some potential power sources for future robotic development include pneumatic power from compressed gases, solar power, hydraulic power, flywheel energy storage, organic garbage processed through anaerobic digestion, and nuclear power.

End Effectors

End effectors are the physical, typically external components that allow robots to finish carrying out their tasks. Robots in factories often have interchangeable tools like paint sprayers and drills, surgical robots may be equipped with scalpels, and other kinds of robots can be built with gripping claws or even hands for tasks like deliveries, packing, bomb defusal, and much more.

How Do Robots Work?

Some robots are pre-programmed to perform specific functions, meaning they operate in a controlled environment where they do simple, monotonous tasks — like a mechanical arm on an automotive assembly line.

Other robots are autonomous, operating independently of human operators to carry out tasks in open environments. In order to work, they use sensors to perceive the world around them, and then employ decision-making structures (usually a computer) to take the optimal next step based on their data and mission.
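The sense-decide-act cycle described above can be sketched in a few lines. The corridor world, function names, and action labels below are invented for illustration; real robots run the same loop against physical sensors and motors.

```python
# A toy sense-decide-act loop: a robot advances along a corridor toward
# the goal cell 'G' and detours around obstacles it senses one cell ahead.
CORRIDOR = [".", ".", "#", ".", ".", "#", ".", "G"]

def sense(position):
    """Read the 'sensor': what occupies the next cell?"""
    return CORRIDOR[position + 1] if position + 1 < len(CORRIDOR) else None

def decide(percept):
    """Map the percept to an action."""
    return "step_around" if percept == "#" else "advance"

def act(position, action):
    """Both actions end one cell forward; 'step_around' models a detour
    that costs extra time but rejoins the corridor one cell ahead."""
    return position + 1

position, log = 0, []
while CORRIDOR[position] != "G":
    percept = sense(position)      # sense the environment
    action = decide(percept)       # decide on the optimal next step
    log.append(action)
    position = act(position, action)  # act on the decision
```

The loop terminates when the robot reaches the goal, having logged a detour for each of the two obstacles it sensed along the way.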

Robots may also work by using wireless networks to enable human control from a safe distance. These teleoperated robots usually work in extreme geographical conditions, weather and circumstances. Examples of teleoperated robots are the human-controlled submarines used to fix underwater pipe leaks during the BP oil spill or drones used to detect landmines on a battlefield.

Types of Robotics

Humanoid Robots

Humanoid robots are robots that look like or mimic human behavior. These robots usually perform human-like activities (like running, jumping and carrying objects), and are sometimes designed to look like us, even having human faces and expressions. Two of the most prominent examples of humanoid robots are Hanson Robotics’ Sophia and Boston Dynamics’ Atlas.

Cobots

Cobots, or collaborative robots, are robots designed to work alongside humans. These robots prioritize safety by using sensors to remain aware of their surroundings, executing slow movements, and ceasing actions when their movements are obstructed. Cobots typically perform simple tasks, freeing up humans to address more complex work.

Industrial Robots

Industrial robots automate processes in manufacturing environments like factories and warehouses. Possessing at least one robotic arm, these robots are made to handle heavy objects while moving with speed and precision. As a result, industrial robots often work in assembly lines to boost productivity.

Medical Robots

Medical robots assist healthcare professionals in various scenarios and support the physical and mental health of humans. These robots rely on AI and sensors to navigate healthcare facilities, interact with humans and execute precise movements. Some medical robots can even converse with humans, encouraging people’s social and emotional growth.

Agricultural Robots

Agricultural robots handle repetitive and labor-intensive tasks, allowing farmers to use their time and energy more efficiently. These robots also operate in greenhouses, where they monitor crops and help with harvests. Agricultural robots come in many forms, ranging from autonomous tractors to drones that collect data for farmers to analyze.

Microrobotics

Microrobotics is the study and development of robots on a miniature scale. Often no bigger than a millimeter, microrobots can vary in size, depending on the situation. Biotech researchers typically use microrobotics to monitor and treat diseases, with the goal of improving diagnostic tools and creating more targeted solutions.

Augmenting Robots

Augmenting robots, also known as VR robots, either enhance current human capabilities or replace capabilities a human may have lost. Human augmentation is a field where science fiction could soon become reality, with bots that could make humans faster and stronger. Some examples of current augmenting robots are robotic prosthetic limbs and exoskeletons used to lift hefty weights.

Software Bots

Software bots, or simply ‘bots,’ are computer programs that carry out tasks autonomously; they are not technically considered robots. One common use case is the chatbot, a program that simulates conversation online or over the phone and is often used in customer service scenarios. Chatbots can be simple services that answer questions with automated responses or more complex digital assistants that learn from user information.
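The "simple service" end of that spectrum can be sketched as a rule-based chatbot: each rule pairs a pattern with a canned reply. All of the rules and replies below are made up for illustration; production chatbots layer NLP and learned models on top of this basic idea.

```python
import re

# Each rule pairs a case-insensitive regex with a canned reply.
RULES = [
    (re.compile(r"\b(hi|hello)\b", re.I), "Hello! How can I help you?"),
    (re.compile(r"\bhours?\b", re.I), "We are open 9am-5pm, Monday to Friday."),
    (re.compile(r"\brefund\b", re.I), "Refunds are processed within 5 business days."),
]

def reply(message):
    """Return the first matching canned answer, or a fallback."""
    for pattern, answer in RULES:
        if pattern.search(message):
            return answer
    return "Sorry, I didn't understand that. Could you rephrase?"

answer = reply("Hi there")  # "Hello! How can I help you?"
```

A digital assistant replaces the fixed rule table with intent classification and dialogue state, but the request-in, reply-out loop stays the same.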

Robotics Applications

Beginning as a major boon for manufacturers, robotics has become a mainstay technology for a growing number of industries.

Manufacturing

Industrial robots can assemble products, sort items, perform welds and paint objects. They may even be used to fix and maintain other machines in a factory or warehouse. 

Healthcare

Medical robots transport medical supplies, perform surgical procedures and offer emotional support to those going through rehabilitation.

Companionship

Social robots can support children with learning disabilities and act as a therapeutic tool for people with dementia. They also have business applications like providing in-person customer service in hotels and moving products around warehouses. 

Home

Consumers may be most familiar with the Roomba and other robot vacuum cleaners. However, other home robots include lawn-mowing robots and personal robot assistants that can play music, engage with children and help with household chores.

Search and Rescue

Search and rescue robots can save those stuck in flood waters, deliver supplies to those stranded in remote areas and put out fires when conditions become too extreme for firefighters.

Pros and Cons of Robotics

Robotics comes with a number of benefits and drawbacks.

Pros of Robotics

  • Increased accuracy. Robots can perform movements and actions with greater precision and accuracy than humans.
  • Enhanced productivity. Robots can work at a faster pace than humans and don’t get tired, leading to more consistent and higher-volume production. 
  • Improved safety. Robots can take on tasks and operate in environments unsafe for humans, protecting workers from injuries. 
  • Rapid innovation. Many robots are equipped with sensors and cameras that collect data, so teams can quickly refine processes. 
  • Greater cost-efficiency. Gains in productivity may make robots a more cost-efficient option for businesses compared to hiring more human workers.

Cons of Robotics

  • Job losses. Robotic process automation may put human employees out of work, especially those who don’t have the skills to adapt to a changing workplace.  
  • Limited creativity. Robots may not react well to unexpected situations since they don’t have the same problem-solving skills as humans. 
  • Data security risks. Robots can be hit with cyber attacks, potentially exposing large amounts of data if they’re connected to the Internet of Things.  
  • Maintenance costs. Robots can be expensive to repair and maintain, and faulty equipment can lead to disruptions in production and revenue losses.  
  • Environmental waste. Extracting raw materials to build robots and having to discard disposable parts can lead to more environmental waste and pollution.


Future of Robotics

The evolution of AI has major implications for the future of robotics. In factories, AI can be combined with robotics to produce digital twins and design simulations to help companies improve their workflows. Advanced AI also gives robots increased autonomy. For example, drones could deliver packages to customers without any human intervention. In addition, robots could be outfitted with generative AI tools like ChatGPT, resulting in more complex human-robot conversations.

As robots’ intelligence has shifted, so too have their appearances. Humanoid robots are designed to visually appeal to humans in various settings while understanding and responding to emotions, carrying objects and navigating environments. With these forms and abilities, robots can become major contributors in customer service, manufacturing, logistics and healthcare, among other industries.

While the spread of robotics has stoked fears over job losses due to automation, robots could simply change the nature of human jobs. Humans may find themselves collaborating with robots, letting their robotic counterparts handle repetitive tasks while they focus on more difficult problems. Either way, humans will need to adapt to the presence of robots as robotics continues to progress alongside other technologies like AI and deep learning.  


History of Robotics

Robotics as a concept goes back to ancient times. The ancient Greeks combined automation and engineering to create the Antikythera mechanism, a handheld device that predicted eclipses. Centuries later, Leonardo da Vinci designed a mechanical knight now known as "Leonardo's Robot." But it was the rise of manufacturing during the Industrial Revolution that highlighted the need for widespread automation.

Following William Grey Walter's development of the first autonomous robots in the late 1940s, George Devol created the first industrial robotic arm, known as Unimate. It began operating at a GM facility in 1959. In 1972, the Stanford Research Institute completed Shakey — the first AI-powered robot. Shakey used cameras and sensors to collect data from its surroundings and inform its next moves.

The ability of robots to perceive their surroundings led researchers to explore whether they could also perceive human emotions. In the late 1990s, MIT’s Dr. Cynthia Breazeal built Kismet, a robotic head that used facial features to express and respond to human emotions. This predecessor to social robots opened the door for future robots like Roomba and consumer-centric inventions like Alexa and other voice assistants.

Robots took another leap forward in 2012 due to a breakthrough in deep learning. Armed with a massive collection of labeled digital images, AI researcher Geoffrey Hinton and his students trained a deep neural network to classify more than a million images with far fewer errors than earlier systems. Since then, companies have incorporated deep learning into their technologies, opening up more possibilities for robotics.

1700s (1737) Jacques de Vaucanson builds the first biomechanical automaton on record. Called the Flute Player, the mechanical device plays 12 songs.

1920s (1920) The word “robot” makes its first appearance in Karel Capek’s play R.U.R. Robot is derived from the Czech word “robota,” which means “forced labor.”

1930s (1936) Alan Turing publishes “On Computable Numbers,” a paper that introduces the concept of a theoretical computer called the Turing Machine.

1940s (1948) Cybernetics: Or Control and Communication in the Animal and the Machine is published by MIT professor Norbert Wiener. The book explores communication and control in electronic, mechanical and biological systems.

(1949) William Grey Walter, a neurophysiologist and inventor, introduces Elmer and Elsie, a pair of battery-operated robots that look like tortoises. The robots move objects, find a source of light and find their way back to a charging station.

1950s (1950) Isaac Asimov publishes I, Robot, the story collection containing his Three Laws of Robotics.

(1950) Alan Turing publishes the paper “Computing Machinery and Intelligence,” proposing what is now known as the Turing Test, a method for determining if a machine is intelligent.

1960s (1961) The first robotic arm works in a General Motors facility. The arm lifts and stacks metal parts and follows a program for approximately 200 movements. The arm was created by George Devol and his partner Joseph Engelberger.

(1969) Victor Scheinman invents the Stanford Arm, a robotic arm with six joints that can mimic the movements of a human arm. It is one of the first robots designed to be controlled by a computer.

1970s (1972) A group of engineers at the Stanford Research Institute create Shakey, the first robot to use artificial intelligence.

(1978) Hiroshi Makino, an automation researcher, designs a four-axis SCARA robotic arm.

1980s (1985) The PUMA 560 robotic arm is used in the first documented robot-assisted surgical procedure.

(1985) William Whittaker builds two remotely-operated robots that are sent to the Three Mile Island nuclear power plant.

(1989) MIT researchers Rodney Brooks and A. M. Flynn publish "Fast, Cheap and Out of Control: A Robot Invasion of the Solar System."

1990s (1997) Sojourner lands on Mars. The free-ranging rover sends 2.3 billion bits of data back to Earth.

(1998) Furby, a robotic toy pet developed by Tiger Electronics, is released and eventually sells tens of millions of units. Furbys are preprogrammed to speak gibberish and learn other languages over time. 

(1999) Aibo, a robotic puppy powered by AI, hits the commercial market. Developed by Sony, the robotic dog reacts to sounds and has some pre-programmed behaviors.

2000s (2000) Cynthia Breazeal creates a robotic head, called Kismet, programmed to provoke emotions as well as react to them.

(2002) iRobot releases the Roomba. The robot vacuum becomes the first robot to gain widespread popularity with consumers.

(2003) Mick Mountz and the cofounders of Amazon Robotics (formerly Kiva Systems) invent the Kiva robot. The robot maneuvers around warehouses and moves goods.

(2004) Boston Dynamics unveils BigDog, a quadruped robot controlled by humans.

(2004) The Defense Department's Defense Advanced Research Projects Agency establishes the DARPA Grand Challenge, a self-driving car race that aims to inspire innovation in military autonomous vehicle technology.

2010s (2011) NASA and General Motors collaborate to send Robonaut 2, a humanesque robotic assistant, into space on space shuttle Discovery. The robot becomes a permanent resident of the International Space Station.

(2012) The first license for a self-driving car is issued in Nevada. The car is a Toyota Prius modified with technology developed by Google.

(2013) Boston Dynamics releases Atlas, a humanoid biped robot that uses 28 hydraulic joints to mimic human movements; later versions can even perform a backflip.

(2016) Sophia, a humanoid robot dubbed the first robot citizen, is created by Hanson Robotics. The robot is capable of facial recognition, verbal communication and facial expression.

2020s (2020) Robots are used to distribute Covid-19 tests and vaccinations. 

(2020) 384,000 industrial robots are shipped across the globe to perform various manufacturing and warehouse jobs.  

(2021) Cruise, an autonomous car company, conducts its first two robotaxi test rides in San Francisco.


March 23, 2009


Rise of the Robots--The Future of Artificial Intelligence

By 2050 robot "brains" based on computers that execute 100 trillion instructions per second will start rivaling human intelligence

By Hans Moravec

Editor's Note: This article was originally printed in the 2008 Scientific American Special Report on Robots. It is being published on the Web as part of ScientificAmerican.com's In-Depth Report on Robots.

In recent years the mushrooming power, functionality and ubiquity of computers and the Internet have outstripped early forecasts about technology’s rate of advancement and usefulness in everyday life. Alert pundits now foresee a world saturated with powerful computer chips, which will increasingly insinuate themselves into our gadgets, dwellings, apparel and even our bodies.

Yet a closely related goal has remained stubbornly elusive. In stark contrast to the largely unanticipated explosion of computers into the mainstream, the entire endeavor of robotics has failed rather completely to live up to the predictions of the 1950s. In those days experts who were dazzled by the seemingly miraculous calculational ability of computers thought that if only the right software were written, computers could become the artificial brains of sophisticated autonomous robots. Within a decade or two, they believed, such robots would be cleaning our floors, mowing our lawns and, in general, eliminating drudgery from our lives.


Obviously, it hasn’t turned out that way. It is true that industrial robots have transformed the manufacture of automobiles, among other products. But that kind of automation is a far cry from the versatile, mobile, autonomous creations that so many scientists and engineers have hoped for. In pursuit of such robots, waves of researchers have grown disheartened and scores of start-up companies have gone out of business.

It is not the mechanical “body” that is unattainable; articulated arms and other moving mechanisms adequate for manual work already exist, as the industrial robots attest. Rather it is the computer-based artificial brain that is still well below the level of sophistication needed to build a humanlike robot.

Nevertheless, I am convinced that the decades-old dream of a useful, general-purpose autonomous robot will be realized in the not too distant future. By 2010 we will see mobile robots as big as people but with cognitive abilities similar in many respects to those of a lizard. The machines will be capable of carrying out simple chores, such as vacuuming, dusting, delivering packages and taking out the garbage. By 2040, I believe, we will finally achieve the original goal of robotics and a thematic mainstay of science fiction: a freely moving machine with the intellectual capabilities of a human being.

Reasons for Optimism

In light of what I have just described as a history of largely unfulfilled goals in robotics, why do I believe that rapid progress and stunning accomplishments are in the offing? My confidence is based on recent developments in electronics and software, as well as on my own observations of robots, computers and even insects, reptiles and other living things over the past 30 years.

The single best reason for optimism is the soaring performance in recent years of mass-produced computers. Through the 1970s and 1980s, the computers readily available to robotics researchers were capable of executing about one million instructions per second (MIPS). Each of these instructions represented a very basic task, like adding two 10-digit numbers or storing the result in a specified location in memory.

In the 1990s computer power suitable for controlling a research robot shot through 10 MIPS, 100 MIPS and has lately reached 50,000 MIPS in a few high-end desktop computers with multiple processors. Apple’s MacBook laptop computer, with a retail price at the time of this writing of $1,099, achieves about 10,000 MIPS. Thus, functions far beyond the capabilities of robots in the 1970s and 1980s are now coming close to commercial viability.

For example, in October 1995 an experimental vehicle called Navlab V crossed the U.S. from Washington, D.C., to San Diego, driving itself more than 95 percent of the time. The vehicle's self-driving and navigational system was built around a 25-MIPS laptop based on a microprocessor by Sun Microsystems. The Navlab V was built by the Robotics Institute at Carnegie Mellon University, of which I am a member. Similar robotic vehicles, built by researchers elsewhere in the U.S. and in Germany, have logged thousands of highway kilometers under all kinds of weather and driving conditions. Dramatic progress in this field became evident in the DARPA Grand Challenge contests held in California. In October 2005 several fully autonomous cars successfully traversed a hazard-studded 132-mile desert course, and in 2007 several successfully drove for half a day in urban traffic conditions.

In other experiments within the past few years, mobile robots mapped and navigated unfamiliar office suites, and computer vision systems located textured objects and tracked and analyzed faces in real time. Meanwhile personal computers became much more adept at recognizing text and speech.

Still, computers are no match today for humans in such functions as recognition and navigation. This puzzled experts for many years, because computers are far superior to us in calculation. The explanation of this apparent paradox follows from the fact that the human brain, in its entirety, is not a true programmable, general-purpose computer (what computer scientists refer to as a universal machine; almost all computers nowadays are examples of such machines).

Understanding this apparent paradox requires an evolutionary perspective. To survive, our early ancestors had to do several things repeatedly and very well: locate food, escape predators, mate and protect offspring. Those tasks depended strongly on the brain's ability to recognize and navigate. Honed by hundreds of millions of years of evolution, the brain became a kind of ultrasophisticated—but special-purpose—computer.

The ability to do mathematical calculations, of course, was irrelevant for survival. Nevertheless, as language transformed human culture, at least a small part of our brains evolved into a universal machine of sorts. One of the hallmarks of such a machine is its ability to follow an arbitrary set of instructions, and with language, such instructions could be transmitted and carried out. But because we visualize numbers as complex shapes, write them down and perform other such functions, we process digits in a monumentally awkward and inefficient way. We use hundreds of billions of neurons to do in minutes what hundreds of them, specially "rewired" and arranged for calculation, could do in milliseconds.

A tiny minority of people are born with the ability to do seemingly amazing mental calculations. In absolute terms, it’s not so amazing: they calculate at a rate perhaps 100 times that of the average person. Computers, by comparison, are millions or billions of times faster.

Can Hardware Simulate Wetware?

The challenge facing roboticists is to take general-purpose computers and program them to match the largely special-purpose human brain, with its ultraoptimized perceptual inheritance and other peculiar evolutionary traits. Today's robot-controlling computers are much too feeble to be applied successfully in that role, but it is only a matter of time before they are up to the task.

Implicit in my assertion that computers will eventually be capable of the same kind of perception, cognition and thought as humans is the idea that a sufficiently advanced and sophisticated artificial system—for example, an electronic one—can be made and programmed to do the same thing as the human nervous system, including the brain. This issue is controversial in some circles right now, and there is room for brilliant people to disagree.

At the crux of the matter is the question of whether biological structure and behavior arise entirely from physical law and whether, moreover, physical law is computable—that is to say, amenable to computer simulation. My view is that there is no good scientific evidence to negate either of these propositions. On the contrary, there are compelling indications that both are true.

Molecular biology and neuroscience are steadily uncovering the physical mechanisms underlying life and mind but so far have addressed mainly the simpler mechanisms. Evidence that simple functions can be composed to produce the higher capabilities of nervous systems comes from programs that read, recognize speech, guide robot arms to assemble tight components by feel, classify chemicals by artificial smell and taste, reason about abstract matters, and so on. Of course, computers and robots today fall far short of broad human or even animal competence. But that situation is understandable in light of an analysis, summarized in the next section, that concludes that today’s computers are only powerful enough to function like insect nervous systems. And, in my experience, robots do indeed perform like insects on simple tasks.

Ants, for instance, can follow scent trails but become disoriented when the trail is interrupted. Moths follow pheromone trails and also use the moon for guidance. Similarly, many commercial robots can follow guide wires installed below the surface they move over, and some orient themselves using lasers that read bar codes on walls.

If my assumption that greater computer power will eventually lead to human-level mental capabilities is true, we can expect robots to match and surpass the capacity of various animals and then finally humans as computer-processing rates rise sufficiently high. If on the other hand the assumption is wrong, we will someday find specific animal or human skills that elude implementation in robots even after they have enough computer power to match the whole brain. That would set the stage for a fascinating scientific challenge—to somehow isolate and identify the fundamental ability that brains have and that computers lack. But there is no evidence yet for such a missing principle.

The second proposition, that physical law is amenable to computer simulation, is increasingly beyond dispute. Scientists and engineers have already produced countless useful simulations, at various levels of abstraction and approximation, of everything from automobile crashes to the “color” forces that hold quarks and gluons together to make up protons and neutrons.

Nervous Tissue and Computation

If we accept that computers will eventually become powerful enough to simulate the mind, the question that naturally arises is: What processing rate will be necessary to yield performance on a par with the human brain? To explore this issue, I have considered the capabilities of the vertebrate retina, which is understood well enough to serve as a Rosetta stone roughly relating nervous tissue to computation. By comparing how fast the neural circuits in the retina perform image-processing operations with how many instructions per second it takes a computer to accomplish similar work, I believe it is possible to at least coarsely estimate the information-processing power of nervous tissue—and by extrapolation, that of the entire human nervous system.

The human retina is a patch of nervous tissue in the back of the eyeball half a millimeter thick and approximately two centimeters across. It consists mostly of light-sensing cells, but one tenth of a millimeter of its thickness is populated by image-processing circuitry that is capable of detecting edges (boundaries between light and dark) and motion for about a million tiny image regions. Each of these regions is associated with its own fiber in the optic nerve, and each performs about 10 detections of an edge or a motion each second. The results flow deeper into the brain along the associated fiber.

From long experience working on robot vision systems, I know that similar edge or motion detection, if performed by efficient software, requires the execution of at least 100 computer instructions. Therefore, to accomplish the retina’s 10 million detections per second would necessitate at least 1,000 MIPS.

The entire human brain is about 75,000 times heavier than the 0.02 gram of processing circuitry in the retina, which implies that it would take, in round numbers, 100 million MIPS (100 trillion instructions per second) to emulate the 1,500-gram human brain. Personal computers in 2008 are just about a match for the 0.1-gram brain of a guppy, but a typical PC would have to be at least 10,000 times more powerful to perform like a human brain.
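The extrapolation is easy to check. Here is a short sketch using only the figures given in the text; note that the raw product is 75 million MIPS, which the author rounds up to 100 million:

```python
# Reproducing the retina-to-brain estimate from the figures in the text.
regions = 1_000_000            # image regions in the retina
detections_per_s = 10          # edge/motion detections per region per second
instr_per_detection = 100      # computer instructions per detection in software

retina_mips = regions * detections_per_s * instr_per_detection / 1_000_000
print(retina_mips)             # 1000.0 MIPS to emulate the retina

mass_ratio = 75_000            # whole brain vs. retinal processing circuitry
brain_mips = retina_mips * mass_ratio
print(brain_mips)              # 75,000,000 MIPS, rounded up to 100 million
```

The rounding makes the "in round numbers" caveat explicit: 100 million MIPS is 100 trillion instructions per second, the figure in the article's subtitle.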

Brainpower and Utility

Though dispiriting to artificial-intelligence experts, the huge deficit does not mean that the goal of a humanlike artificial brain is unreachable. Computer power for a given price doubled each year in the 1990s, after doubling every 18 months in the 1980s and every two years before that. Prior to 1990 this progress made possible a great decrease in the cost and size of robot-controlling computers. Cost went from many millions of dollars to a few thousand, and size went from room-filling to handheld. Power, meanwhile, held steady at about 1 MIPS. Since 1990 cost and size reductions have abated, but power has risen to about 10,000 MIPS for a home computer. At the present pace, only about 20 or 30 years will be needed to close the gap. Better yet, useful robots don't need full human-scale brainpower.
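The "20 or 30 years" figure follows from simple doubling arithmetic. A sketch, assuming the three historical doubling periods the text cites (12, 18 and 24 months):

```python
import math

current_mips = 10_000        # home computer, circa 2008 (from the text)
target_mips = 100_000_000    # Moravec's estimate for the human brain

# Number of doublings needed to close the 10,000x gap.
doublings = math.log2(target_mips / current_mips)
print(round(doublings, 1))   # 13.3

# Years required at each historical doubling period.
for months in (12, 18, 24):
    print(months, "months per doubling:", round(doublings * months / 12, 1), "years")
```

Only the fastest rate closes the gap in about 13 years; the slower historical rates give roughly 20 to 27 years, matching the author's range.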

Commercial and research experiences convince me that the mental power of a guppy—about 10,000 MIPS—will suffice to guide mobile utility robots reliably through unfamiliar surroundings, suiting them for jobs in hundreds of thousands of industrial locations and eventually hundreds of millions of homes. A few machines with 10,000 MIPS are here already, but most industrial robots still use processors with less than 1,000 MIPS.

Commercial mobile robots have found few jobs. A paltry 10,000 work worldwide, and the companies that made them are struggling or defunct. (Makers of robot manipulators are not doing much better.) The largest class of commercial mobile robots, known as automatic guided vehicles (AGVs), transport materials in factories and warehouses. Most follow buried signal-emitting wires and detect end points and collisions with switches, a technique developed in the 1960s.

It costs hundreds of thousands of dollars to install guide wires under concrete floors, and the routes are then fixed, making the robots economical only for large, exceptionally stable factories. Some robots made possible by the advent of microprocessors in the 1980s track softer cues, like magnets or optical patterns in tiled floors, and use ultrasonics and infrared proximity sensors to detect and negotiate their way around obstacles.

The most advanced industrial mobile robots, developed since the late 1980s, are guided by occasional navigational markers—for instance, laser-sensed bar codes—and by preexisting features such as walls, corners and doorways. The costly labor of laying guide wires is replaced by custom software that is carefully tuned for each route segment. The small companies that developed the robots discovered many industrial customers eager to automate transport, floor cleaning, security patrol and other routine jobs. Alas, most buyers lost interest as they realized that installation and route changing required time-consuming and expensive work by experienced route programmers of inconsistent availability. Technically successful, the robots fizzled commercially.

In failure, however, they revealed the essentials for success. First, the physical vehicles for various jobs must be reasonably priced. Fortunately, existing AGVs, forklift trucks, floor scrubbers and other industrial machines designed for accommodating human riders or for following guide wires can be adapted for autonomy. Second, the customer should not have to call in specialists to put a robot to work or to change its routine; floor cleaning and other mundane tasks cannot bear the cost, time and uncertainty of expert installation. Third, the robots must work reliably for at least six months before encountering a problem or a situation requiring downtime for reprogramming or other alterations. Customers routinely rejected robots that after a month of flawless operation wedged themselves in corners, wandered away lost, rolled over employees’ feet or fell down stairs. Six months, though, earned the machines a sick day.

Robots exist that have worked faultlessly for years, perfected by an iterative process that fixes the most frequent failures, revealing successively rarer problems that are corrected in turn. Unfortunately, that kind of reliability has been achieved only for prearranged routes. An insectlike 10 MIPS is just enough to track a few handpicked landmarks on each segment of a robot’s path. Such robots are easily confused by minor surprises such as shifted bar codes or blocked corridors (not unlike ants thrown off a scent trail or a moth that has mistaken a streetlight for the moon).

A Sense of Space

Robots that chart their own routes emerged from laboratories worldwide in the mid-1990s, as microprocessors reached 100 MIPS. Most build two-dimensional maps from sonar or laser rangefinder scans to locate and route themselves, and the best seem able to navigate office hallways for days before becoming disoriented. Of course, they still fall far short of the six-month commercial criterion. Too often different locations in the coarse maps resemble one another. Conversely, the same location, scanned at different heights, looks different, or small obstacles or awkward protrusions are overlooked. But sensors, computers and techniques are improving, and success is in sight.

My efforts are in the race. In the 1980s at Carnegie Mellon we devised a way to distill large amounts of noisy sensor data into reliable maps by accumulating statistical evidence of emptiness or occupancy in each cell of a grid representing the surroundings. The approach worked well in two dimensions and still guides many of the robots described above.
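The evidence-accumulation idea can be illustrated with a minimal sketch: a log-odds occupancy grid in one dimension. The sensor-accuracy numbers are invented, and this shows the general technique rather than the Carnegie Mellon implementation:

```python
import math

N_CELLS = 10
log_odds = [0.0] * N_CELLS      # 0.0 = 50/50 prior: no evidence either way

# Log-odds increments for a hypothetical sensor that is right 80% of the time.
HIT = math.log(0.8 / 0.2)       # reading claims the cell is occupied
MISS = math.log(0.2 / 0.8)      # reading claims the cell is empty

def integrate(cell, occupied):
    """Accumulate one noisy reading's evidence into the map."""
    log_odds[cell] += HIT if occupied else MISS

def probability(cell):
    """Convert accumulated log-odds back into an occupancy probability."""
    return 1.0 - 1.0 / (1.0 + math.exp(log_odds[cell]))

# Cell 3 really holds a wall: five correct "occupied" readings, one bad one.
for reading in (True, True, True, False, True, True):
    integrate(3, reading)

print(round(probability(3), 3))   # 0.996: one bad reading barely matters
```

Because evidence is summed rather than trusted reading-by-reading, noisy individual scans average into a reliable map, which is exactly why the approach still guides many of the robots described above.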

Three-dimensional maps, 1,000 times richer, promised to be much better but for years seemed computationally out of reach. In 1992 we used economies of scale and other tricks to reduce the computational costs of three-dimensional maps 100-fold. Continued research led us to found a company, Seegrid, that sold its first dozen robots by late 2007. These are load-pulling warehouse and factory “tugger” robots that, on command, autonomously follow routes learned in a single human-guided walk-through. They navigate by three-dimensionally grid-mapping their route, as seen through four wide-angle stereoscopic cameras mounted on a “head,” and require no guide wires or other navigational markers.

Robot, Version 1.0

In 2008 desktop PCs offer more than 10,000 MIPS. Seegrid tuggers, using slightly older processors doing about 5,000 MIPS, distill about one visual "glimpse" per second. A few thousand visually distinctive patches in the surroundings are selected in each glimpse, and their 3-D positions are statistically estimated. When the machine is learning a new route, these 3-D patches are merged into a chain of 3-D grid maps describing a 30-meter "tunnel" around the route. When the tugger is automatically retracing a taught path, the patches are compared with the stored grid maps. With many thousands of 3-D fuzzy patches weighed statistically by a so-called sensor model, which is trained offline using calibrated example routes, the system is remarkably tolerant of poor sight, changes in lighting, movement of objects, mechanical inaccuracies and other perturbations.

Seegrid’s computers, perception programs and end products are being rapidly improved and will gain new functionalities such as the ability to find, pick up and drop loads. The potential market for materials-handling automation is large, but most of it has been inaccessible to older approaches involving buried guide wires or other path markers, which require extensive planning and installation costs and create inflexible routes. Vision-guided robots, on the other hand, can be easily installed and rerouted.

Fast Replay

Plans are afoot to improve, extend and miniaturize our techniques so that they can be used in other applications. On the short list are consumer robot vacuum cleaners. Externally these may resemble the widely available Roomba machines from iRobot. The Roomba, however, is a simple beast that moves randomly, senses only its immediate obstacles and can get trapped in clutter. A Seegrid robot would see, explore and map its premises and would run unattended, with a cleaning schedule minimizing owner disturbances. It would remember its recharging locations, allowing for frequent recharges to run a powerful vacuum motor, and also would be able to frequently empty its dust load into a larger container.

Commercial success will provoke competition and accelerate investment in manufacturing, engineering and research. Vacuuming robots ought to beget smarter cleaning robots with dusting, scrubbing and picking-up arms, followed by larger multifunction utility robots with stronger, more dexterous arms and better sensors. Programs will be written to make such machines pick up clutter, store, retrieve and deliver things, take inventory, guard homes, open doors, mow lawns, play games, and so on. New applications will expand the market and spur further advances when robots fall short in acuity, precision, strength, reach, dexterity, skill or processing power. Capability, numbers sold, engineering and manufacturing quality, and cost-effectiveness will increase in a mutually reinforcing spiral. Perhaps by 2010 the process will have produced the first broadly competent "universal robots," as big as people but with lizardlike 20,000-MIPS minds that can be programmed for almost any simple chore.

Like competent but instinct-ruled reptiles, first-generation universal robots will handle only contingencies explicitly covered in their application programs. Unable to adapt to changing circumstances, they will often perform inefficiently or not at all. Still, so much physical work awaits them in businesses, streets, fields and homes that robotics could begin to overtake pure information technology commercially.

A second generation of universal robot with a mouselike 100,000 MIPS will adapt as the first generation does not and will even be trainable. Besides application programs, such robots would host a suite of software "conditioning modules" that would generate positive and negative reinforcement signals in predefined circumstances. For example, doing jobs fast and keeping its batteries charged will be positive; hitting or breaking something will be negative. There will be other ways to accomplish each stage of an application program, from the minutely specific (grasp the handle underhand or overhand) to the broadly general (work indoors or outdoors). As jobs are repeated, alternatives that result in positive reinforcement will be favored, those with negative outcomes shunned. Slowly but surely, second-generation robots will work increasingly well.
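The conditioning-module idea is, in essence, reinforcement learning over alternative behaviors. A toy sketch of the mechanism: the grasp names echo the text's example, while the success rates and the simple explore/exploit rule are invented purely for illustration:

```python
import random

random.seed(0)
score = {"underhand": 0.0, "overhand": 0.0}   # accumulated reinforcement

def reinforcement(choice):
    """Pretend the overhand grasp succeeds far more often than the underhand."""
    p_success = 0.8 if choice == "overhand" else 0.3
    return 1.0 if random.random() < p_success else -1.0

for _ in range(200):                # "as jobs are repeated..."
    if random.random() < 0.1:       # occasionally try an alternative
        choice = random.choice(list(score))
    else:                           # mostly favor what has worked so far
        choice = max(score, key=score.get)
    score[choice] += reinforcement(choice)

best = max(score, key=score.get)
print(best)                         # the more successful grasp dominates
```

Nothing tells the robot which grasp is better; the reinforcement signal alone steers repeated jobs toward the alternative that works, which is the "slowly but surely" improvement the text describes.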

A monkeylike five million MIPS will permit a third generation of robots to learn very quickly from mental rehearsals in simulations that model physical, cultural and psychological factors. Physical properties include shape, weight, strength, texture and appearance of things, and ways to handle them. Cultural aspects include a thing’s name, value, proper location and purpose. Psychological factors, applied to humans and robots alike, include goals, beliefs, feelings and preferences. Developing the simulators will be a huge undertaking involving thousands of programmers and experience-gathering robots. The simulation would track external events and tune its models to keep them faithful to reality. It would let a robot learn a skill by imitation and afford a kind of consciousness. Asked why there are candles on the table, a third-generation robot might consult its simulation of house, owner and self to reply that it put them there because its owner likes candlelit dinners and it likes to please its owner. Further queries would elicit more details about a simple inner mental life concerned only with concrete situations and people in its work area.

Fourth-generation universal robots with a humanlike 100 million MIPS will be able to abstract and generalize. They will result from melding powerful reasoning programs to third-generation machines. These reasoning programs will be the far more sophisticated descendants of today’s theorem provers and expert systems, which mimic human reasoning to make medical diagnoses, schedule routes, make financial decisions, configure computer systems, analyze seismic data to locate oil deposits, and so on.

Properly educated, the resulting robots will become quite formidable. In fact, I am sure they will outperform us in any conceivable area of endeavor, intellectual or physical. Inevitably, such a development will lead to a fundamental restructuring of our society. Entire corporations will exist without any human employees or investors at all. Humans will play a pivotal role in formulating the intricate complex of laws that will govern corporate behavior. Ultimately, though, it is likely that our descendants will cease to work in the sense that we do now. They will probably occupy their days with a variety of social, recreational and artistic pursuits, not unlike today’s comfortable retirees or the wealthy leisure classes.

The path I’ve outlined roughly recapitulates the evolution of human intelligence—but 10 million times more rapidly. It suggests that robot intelligence will surpass our own well before 2050. In that case, mass-produced, fully educated robot scientists working diligently, cheaply, rapidly and increasingly effectively will ensure that most of what science knows in 2050 will have been discovered by our artificial progeny!

6 TED Talks About Robotics Everyone Should Watch

July 22, 2019

We hear it all the time: “Robots are taking over our jobs.” Oh, you mean the estimated 2 million manufacturing jobs that will go unfilled over the next 10 years? Believe it or not, according to Deloitte and the Manufacturing Institute, we’re entering an era in which we’re not only losing skilled workers to an aging workforce but also struggling to find new ones, because people have no desire to work in these positions.

One of the most widely adopted solutions to the labor crisis affecting all industries is to use our power and ability to create a world where humans allow robots to help with jobs that are dangerous, undesired, or tedious. And then go beyond that. Think about what other areas robots can help with aside from manufacturing: therapy, inaccessible or undiscovered areas, emergency prevention. The opportunities are endless.

Robotic automation has allowed us to meet these goals and will keep inspiring all industries to go further. We’ve compiled a list of six must-see TED Talks from thought leaders in robotics and automation, along with what we can learn from each.

1.  Meet the Robot Designed Like an Octopus Tentacle

Author: Carl Vause

Premise: Think about everything you do with your hands on a daily basis. You hold your phone, pick up your keys, grasp a water bottle. None of this is possible without the complex system that makes up your body, from your nimble fingers to your brain. Robotic dexterity has now expanded into a world that was previously non-dexterous. Why is this so important? Because these advances can now automate tasks in factories and industries that are facing labor constraints, and/or where it’s dangerous for humans to work.

What we learned/future use: Soft robotic actuators are opening up an entirely new market, handling products that were previously impossible to handle via robotic automation: delicate produce and eggs, bakery items like croissants and donuts, tiny injection-molded pieces, and much more.

2.  The Incredible Potential of Flexible Soft Robots

Author: Giana Gerboni

Premise: Giana Gerboni starts off by explaining the design of standard rigid robots and why we need to consider the opposite approach, better known as soft robots. Her approach is built on embodied intelligence, a trait borrowed from nature that provides the ability to adapt to the environment. This is a useful skill in almost every field in the world. For example, Gerboni works in the biomedical industry, where her soft approach is changing the way we view and use robots in the medical field. She speaks about one of her past projects, which used soft robots in surgery as a minimally invasive solution that expands the viewable space inside the body.

What we learned/future use: Soft robots perform well in situations that require flexibility, adaptability, and repeatability. So it’s important to view their abilities not as an alternative to standard robots, but as an inspiration to go beyond what standard robots can normally do.

3. Why We Have an Emotional Connection to Robots

Author: Kate Darling

Premise: Have you ever seen one of those animal robots fall over and felt bad that it might be hurt? Then you remember it is not a living, breathing animal, and you wonder why it made you emotional in the first place. Welcome to Kate Darling’s TED Talk. In a nutshell, it is simple human instinct to feel a connection toward almost anything, even more so toward something that can move on its own.

What we learned/future use:  Robots aren’t designed specifically to evoke a connection with humans. They are designed to perform tasks that humans don’t want to do, or take over tasks that humans simply can’t do. However, with our new approach to human-robot interaction, we may think of new ways to use robots as therapeutic means for those in nursing homes, people living with disabilities, education, and much more.

While you may not necessarily understand the reasoning behind your connection to a robot (and I’m not saying you should always feel a connection toward all robots, because that’s just weird), you could use it as an opportunity to explore your approach to real-life situations and open up your own ability to be empathetic to what’s around you. For working professionals, this may mean that building connections with robots helps them adapt to and adopt robots in the workplace, and that this “relationship” enables better work on their end.

4.  Meet Spot, the Robot Dog That Can Run, Jump and Open Doors

Author: Marc Raibert

Premise: Disclaimer: this dog is not a pet. Marc Raibert at Boston Dynamics builds his robots around three goals: mobility, dexterity, and perception. We see that the robotic dogs, aside from being strangely cute and lifelike, can perform human-like tasks such as opening doors. But the most impressive programming in the robot is its ability to handle harsh environments with ease. It stays balanced when pushed, it can tread through 10-inch-deep snow with little resistance, and it can tackle other obstacles like hills and stairs.

What we learned/future use: With Marc’s three main goals in mind, one day we could take these robots to areas that are inaccessible to humans and learn about the environment. We could use these robots to deliver packages or in defense. What we will continue to wonder is how safe these robots are to interact with humans. They are programmed to perform certain tasks, but how will they know when to stop in a dangerous situation? What are the ramifications, and how can we prevent them?

5.  The Artificial Muscles That Will Power Robots in the Future

Author: Christoph Keplinger

Premise: Another fantastic, yet somewhat different, example of how soft robotics is making a splash in the robotics ocean. “Actuators are for robots what muscles are for animals.” Building soft robots, soft actuators, and soft muscles can help machines adapt to changing surroundings. As a bonus, he even teaches you how to replicate a similar application at home, so try it out and let us know how it works.

What we learned/future use: While these artificial muscles are relatively new to the world, we won’t be surprised if prosthetics someday use them as the main structure to mimic the human body and perform human tasks.

6.  Why You Should Make Useless Things

Author: Simone Giertz

Premise: If you’re looking for a TED Talk with a little personality and a few good laughs (and a title that might confuse you), look no further. Simone Giertz is an average young professional who went from performance anxiety to outperforming everyone by eliminating her expectations of success and simply inventing things that will fail. Her experience with reality and living with failure is truly an inspiration to those who constantly find themselves unhappy with their own work. Maybe this will guide you to see things differently.

What we learned/future use: Not knowing the answer isn’t always a bad thing. In the world of engineers, inventors, and creators, it seems you are expected to know the answer immediately, sometimes even the right answer immediately. What Simone has proven is that taking time to explore and find your enthusiasm for “useless” work can lead to something bigger. There are plenty of million-dollar robots out there that have failed terribly, mainly because they weren’t solving a problem. If we learn anything from Simone, it’s to find a problem and create a machine that can solve it, even if it’s silly and undeveloped, rather than building an utterly useless machine with no intent.


Don’t Fear the Robots, and Other Lessons From a Study of the Digital Economy

A task force assembled by M.I.T. examined how technology has changed, and will change, the work force.


By Steve Lohr

L. Rafael Reif, the president of Massachusetts Institute of Technology, delivered an intellectual call to arms to the university’s faculty in November 2017: Help generate insights into how advancing technology has changed and will change the work force, and what policies would create opportunity for more Americans in the digital economy.

That issue, he wrote, is the “defining challenge of our time.”

Three years later, the task force assembled to address it is publishing its wide-ranging conclusions. The 92-page report, “The Work of the Future: Building Better Jobs in an Age of Intelligent Machines,” was released on Tuesday.

The group is made up of M.I.T. professors and graduate students, researchers from other universities, and an advisory board of corporate executives, government officials, educators and labor leaders. In an extraordinarily comprehensive effort, they included labor market analysis, field studies and policy suggestions for changes in skills-training programs, the tax code, labor laws and minimum-wage rates.

Here are four of the key findings in the report:

Most American workers have fared poorly.

It’s well known that those on the top rungs of the job ladder have prospered for decades while wages for average American workers have stagnated. But the M.I.T. analysis goes further. It found, for example, that real wages for men without four-year college degrees have declined 10 to 20 percent since their peak in 1980. (Two-thirds of American workers do not have four-year college degrees.)

The U.S. economy produces larger wage gaps, proportionately fewer high-quality jobs and less intergenerational mobility than most other developed nations do, the researchers found. And America does not seem to get a compensating payoff in growth. “The U.S. is getting a low ‘return’ on its inequality,” the report said.


MIT News | Massachusetts Institute of Technology

Helping robots practice skills independently to adapt to unfamiliar environments


The phrase “practice makes perfect” is usually reserved for humans, but it’s also a great maxim for robots newly deployed in unfamiliar environments.

Picture a robot arriving in a warehouse. It comes packaged with the skills it was trained on, like placing an object, and now it needs to pick items from a shelf it’s not familiar with. At first, the machine struggles with this, since it needs to get acquainted with its new surroundings. To improve, the robot will need to understand which skills within an overall task it needs improvement on, then specialize (or parameterize) that action.

A human onsite could program the robot to optimize its performance, but researchers from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) and The AI Institute have developed a more effective alternative. Presented at the Robotics: Science and Systems Conference last month, their “Estimate, Extrapolate, and Situate” (EES) algorithm enables these machines to practice on their own, potentially helping them improve at useful tasks in factories, households, and hospitals.

Sizing up the situation

To help robots get better at activities like sweeping floors, EES works with a vision system that locates and tracks the machine’s surroundings. Then, the algorithm estimates how reliably the robot executes an action (like sweeping) and whether it would be worthwhile to practice more. EES forecasts how well the robot could perform the overall task if it refines that particular skill, and finally, it practices. The vision system subsequently checks whether that skill was done correctly after each attempt.
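The estimate/extrapolate/practice loop described above can be sketched in miniature. This is not the paper's actual algorithm; the function and variable names below are invented, and the "practice improves a skill by a fixed step" model is a toy assumption for illustration only:

```python
# Toy sketch of an EES-style practice loop: estimate each skill's
# reliability, extrapolate how much overall-task success would improve
# if that skill were practiced, then practice the most promising one.
def ees_practice_loop(skills, task_success_given, rounds=5):
    """skills: dict mapping skill name -> estimated success probability.
    task_success_given: function(skills) -> overall task success estimate.
    Returns the sequence of skills chosen for practice."""
    history = []
    for _ in range(rounds):
        baseline = task_success_given(skills)

        # Extrapolate: predicted gain in overall task success if a
        # given skill improved by one (assumed) practice increment.
        def gain(skill):
            improved = dict(skills)
            improved[skill] = min(1.0, skills[skill] + 0.1)
            return task_success_given(improved) - baseline

        best = max(skills, key=gain)
        # Practice the chosen skill; a vision system would verify each
        # attempt, here we simply nudge the reliability estimate upward.
        skills[best] = min(1.0, skills[best] + 0.1)
        history.append(best)
    return history

# Toy model: the task needs both skills, so success is their product;
# the loop keeps selecting the weaker "sweep" skill for practice.
skills = {"sweep": 0.4, "place": 0.9}
chosen = ees_practice_loop(skills, lambda s: s["sweep"] * s["place"])
print(chosen)
```

The key idea the sketch tries to capture is that practice effort is directed by expected improvement on the overall task, not by raw per-skill failure rates alone.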

EES could come in handy in places like a hospital, factory, house, or coffee shop. For example, if you wanted a robot to clean up your living room, it would need help practicing skills like sweeping. According to Nishanth Kumar SM ’24 and his colleagues, though, EES could help that robot improve without human intervention, using only a few practice trials.

“Going into this project, we wondered if this specialization would be possible in a reasonable amount of samples on a real robot,” says Kumar, co-lead author of a paper describing the work, PhD student in electrical engineering and computer science, and a CSAIL affiliate. “Now, we have an algorithm that enables robots to get meaningfully better at specific skills in a reasonable amount of time with tens or hundreds of data points, an upgrade from the thousands or millions of samples that a standard reinforcement learning algorithm requires.”

See Spot sweep

EES’s knack for efficient learning was evident when implemented on Boston Dynamics’ Spot quadruped during research trials at The AI Institute. The robot, which has an arm attached to its back, completed manipulation tasks after practicing for a few hours. In one demonstration, the robot learned how to securely place a ball and ring on a slanted table in roughly three hours. In another, the algorithm guided the machine to improve at sweeping toys into a bin within about two hours. Both results appear to be an upgrade from previous frameworks, which would have likely taken more than 10 hours per task.

“We aimed to have the robot collect its own experience so it can better choose which strategies will work well in its deployment,” says co-lead author Tom Silver SM ’20, PhD ’24, an electrical engineering and computer science (EECS) alumnus and CSAIL affiliate who is now an assistant professor at Princeton University. “By focusing on what the robot knows, we sought to answer a key question: In the library of skills that the robot has, which is the one that would be most useful to practice right now?”

EES could eventually help streamline autonomous practice for robots in new deployment environments, but for now, it comes with a few limitations. For starters, the researchers used tables that were low to the ground, which made it easier for the robot to see its objects. Kumar and Silver also 3D printed an attachable handle that made the brush easier for Spot to grab. The robot didn’t detect some items and identified objects in the wrong places, so the researchers counted those errors as failures.

Giving robots homework

The researchers note that the practice speeds from the physical experiments could be accelerated further with the help of a simulator. Instead of physically working at each skill autonomously, the robot could eventually combine real and virtual practice. They hope to make their system faster with less latency, engineering EES to overcome the imaging delays the researchers experienced. In the future, they may investigate an algorithm that reasons over sequences of practice attempts instead of planning which skills to refine.

“Enabling robots to learn on their own is both incredibly useful and extremely challenging,” says Danfei Xu, an assistant professor in the School of Interactive Computing at Georgia Tech and a research scientist at NVIDIA AI, who was not involved with this work. “In the future, home robots will be sold to all sorts of households and expected to perform a wide range of tasks. We can’t possibly program everything they need to know beforehand, so it’s essential that they can learn on the job. However, letting robots loose to explore and learn without guidance can be very slow and might lead to unintended consequences. The research by Silver and his colleagues introduces an algorithm that allows robots to practice their skills autonomously in a structured way. This is a big step towards creating home robots that can continuously evolve and improve on their own.”

Silver and Kumar’s co-authors are The AI Institute researchers Stephen Proulx and Jennifer Barry, plus four CSAIL members: Northeastern University PhD student and visiting researcher Linfeng Zhao, MIT EECS PhD student Willie McClinton, and MIT EECS professors Leslie Pack Kaelbling and Tomás Lozano-Pérez. Their work was supported, in part, by The AI Institute, the U.S. National Science Foundation, the U.S. Air Force Office of Scientific Research, the U.S. Office of Naval Research, the U.S. Army Research Office, and MIT Quest for Intelligence, with high-performance computing resources from the MIT SuperCloud and Lincoln Laboratory Supercomputing Center.


We ask the experts: Will robots take over the world?


Robots can do a lot for us: they can explore space or they can cut our toenails. But do advances in robotics and artificial intelligence hold hidden threats? Three leaders in their fields answer questions about our relationships with robots.

“I think it is reasonable to be concerned that we may reach a time when robotic intelligence outstrips human intelligence.” - Professor Daniel Wolpert

The origins of robotics go back to the automata invented by ancient civilisations. The word robot entered our vocabulary only in 1920 with Czech writer Karel Čapek’s play R.U.R. (Rossum’s Universal Robots). Over the past 20 years robots have been developed to work in settings that range from manufacturing industry to space. At Cambridge University, robotics is a rapidly developing field within many departments, from theoretical physics and computing to engineering and medical science.

Lord Martin Rees is Emeritus Professor of Cosmology and Astrophysics at the University of Cambridge. He holds the honorary title of Astronomer Royal. Lord Rees is co-founder of the Centre for the Study of Existential Risk, an early-stage initiative which brings together a scientist, philosopher and software entrepreneur.

Kathleen Richardson is an anthropologist of robots. She took her PhD at Cambridge and recently completed a postdoctoral fellowship at UCL. She is writing a book that explores the representational models used by scientists and how they influence the ideas we have about robots as potential friends or enemies.

Daniel Wolpert is a Royal Society Research Professor in the Department of Engineering. His expertise lies in bioengineering and especially the mechanisms that control interactions between brain and body. The focus of his research group is an understanding of movement, which he believes is central to all human activities.

What can robots do for us?

Martin Rees: I think robots have two very different roles. The first is to operate in locations that humans can't reach, such as the aftermaths of accidents in mines, oil-rigs and nuclear power stations. The second, also deeply unglamorous, is to help elderly or disabled people with everyday life: tying shoelaces, cutting toenails and suchlike. Moreover, if robots can be miniaturised, they can perhaps be used inside our bodies for monitoring our health, undertaking surgery, and so forth.

Kathleen Richardson: Some of the roles that robots are expected to play exist because we cannot perform them as humans - for example, exploring outer space. Space exploration is an area where robots are helpful. Robots can be remote and act as extended 'eyes' for humans, enabling us to look beyond our visual experience into terrains that are inhospitable to us. Other roles that robots are expected to perform are roles that humans can play, such as helping the elderly or the infirm. Unfortunately these roles are not best suited to machines, but to other people. So the question is: why would we prefer a machine to do them for us?

Daniel Wolpert: While computers can now beat grandmasters at chess, there is currently no robot that can match the dexterity of a five-year-old child. The field of robotics is similar to where computers were in the 1960s - expensive machines used in simple, repetitive industrial processes. But modern day robotics is changing that. Robots are likely to become as ubiquitous as the smartphone computers we all carry - from microscopic robotics for healthcare and fabrication to human-size robots to take on our everyday tasks or even act as companions.

How soon will machine intelligence outstrip human intelligence?

MR: Up till now, the advances have been patchy. For at least the last 30 years, we've been able to buy for a few pounds a machine that can do arithmetic faster than our brains can. Back in the 1990s IBM's 'Deep Blue' beat Kasparov, the world chess champion. And more recently a computer called 'Watson' beat human challengers in a verbal quiz game on television. But robots are still limited in their ability to sense their environment: they can't yet recognise and move the pieces on a real chessboard as cleverly as a child can. Later this century, however, their more advanced successors may relate to their surroundings (and to people) as adeptly as we do. Moral questions then arise. We accept an obligation to ensure that other human beings, and indeed some animal species, can fulfil their 'natural' potential. So what's our obligation towards sophisticated robots? Should we feel guilty about exploiting them? Should we fret if they are underemployed, frustrated, or bored?

KR: As an anthropologist, I question the idea of 'objective' human intelligence. There are just cultural measures of what intelligence is, and therefore machines could outstrip 'human intelligence'. When that happens will depend on what we decide is the measure of intelligence. Each generation makes a new definition of what it means to be human and what is uniquely a human quality; then a machine comes along and meets it, and many people despair that humanity is on the brink of its own annihilation. This fear of machines is not something inherent in them; it is a consequence of the modes of mimesis (copying and representation) used in the making of robots. This could be seen as a modern form of animism. Animism is a term to describe the personification of nature, but I believe we can apply it to machines. Human beings personify just about everything: we see faces in clouds, mystical impressions in Marmite and robots as an autonomous threat. The human fear of robots and machines arguably has much more to say about human fear of each other than anything inherently technical in the machines. However, one of the consequences of thinking that the problem lies with machines is that as a culture we tend to imagine they are greater and more powerful than they really are, and subsequently they become so.

DW: In a limited sense it already has. Machines can already navigate, remember and search for items with an ability that far outstrips humans. However, there is no machine that can identify visual objects or speech with the reliability and flexibility of humans. These abilities are precursors to any real intelligence such as the ability to reason creatively and invent new problems. Expecting a machine close to the creative intelligence of a human within the next 50 years would be highly ambitious.

Should we be scared by advances in artificial intelligence?

MR: Those who should be worried are the futurologists who believe in the so-called 'singularity', when robots take over and themselves create even more sophisticated progeny. And another worry is that we are increasingly dependent on computer networks, and that these could behave like a single 'brain' with a mind of its own, and with goals that may be contrary to human welfare. I think we should ensure that robots remain as no more than 'idiot savants' - lacking the capacity to outwit us, even though they may greatly surpass us in the ability to calculate and process information.

KR: We need to ask why fears of artificial intelligence and robots persist; none have in fact risen up and challenged human supremacy. To understand what underscores these fears, we need to understand science and technology as having a particular and exclusionary kind of mimesis. Mimesis is the way we copy and imitate. In creating artificial intelligence machines and robots we are copying the human. Part of what we copy is related to the psychic world of the maker, and then the maker is copying ideas, techniques and practices into the machine that are given by the cultural spirit (the science, technology, and life) of the moment. All these factors are fused together in the making of artificial intelligence and robots. So we have to ask why it is so frightening to make this copy. Not all fear a robotic uprising; many people welcome machine intelligence and see it as a wonderful opportunity to create a new life. So to understand why some fear and some embrace, you really have to know what models of mimesis go into the making of robots.

DW: We have already seen the damaging effects of the simplest forms of artificial self-replicating intelligence in the form of computer viruses. But in this case, the real intelligence is the malicious designer. Critically, the benefits of computers outweigh the damage that computer viruses cause. Similarly, while there may be misuses of robotics in the near future, the benefits that they will bring are likely to outweigh these negative aspects. I think it is reasonable to be concerned that we may reach a time when robotic intelligence outstrips humans' and robots are able to design and produce robots more advanced than themselves.

Should robots be used to colonise other planets?

MR: By the end of the century, the entire solar system -- planets, moons and asteroids -- will be explored and mapped by flotillas of tiny robotic craft. The next step would be mining of asteroids, enabling fabrication of large structures in space without having to bring all the raw materials from Earth. It would be possible to develop huge artefacts: giant telescopes with gossamer-thin mirrors assembled under zero gravity, collectors of solar energy, and so forth. I think this is more realistic and benign than the so-called 'terraforming' of planets - which should be preserved with a status that is analogous to Antarctica here on Earth (at least until we are sure that there is no form of life already there).

KR: I am not happy with the word 'colonise' for humans or robots. Europeans colonised other peoples' lands and left a long legacy of enslavement, problems, disease and, for many, suffering. I think whether we do something on Earth or on Mars we should always do it in the spirit of a genuine interest in 'the-Other', not to impose a particular model, but to meet 'the-Other'. Robots could help us to go to places we cannot physically go ourselves, but these robots cannot interpret what they are seeing for us.

DW: I don't see a pressing need to colonise other planets unless we can bring resources back to Earth. The vast majority of Earth is currently inaccessible to us. Using robots to gather resources nearer to home would seem to be a better use of our robotic tools.

What can science fiction tell us about robotics?

MR: I sometimes advise students that it's better to read first-rate science fiction than second-rate science -- more stimulating, and perhaps no more likely to be wrong. Even those of us who don't buy the idea of a singularity by mid-century would expect sustained, if not enhanced, rate of innovation in biotech, nanotech and in information science. I think there will be robotic entities with superhuman intellect within a few centuries. Post-human intelligence (whether in organic form, or in autonomously-evolving artefacts) will develop hyper-computers with the processing power to simulate living things, even entire worlds. Perhaps advanced beings could use hyper-computers to surpass the best 'special effects' in movies or computer games so vastly that they could simulate a world fully as complex as the one we perceive ourselves to be in. Maybe these kinds of super-intelligences already exist elsewhere in the universe.

KR: Fiction and science fiction are so important for everyday life. In Western culture we tend to think there is reality on the one hand, and fiction and fantasy on the other. This separation does not exist in all cultures, but scientists and technologists made this deliberate separation because they wanted to carve out the sphere of their work. In doing this they denigrated lots of valuable knowledge, such as myth and metaphor, that might be important in developing a richer model. But the divide is not so clear-cut, and that is why the worlds seem to collide at times. In some cases we need to bring these different understandings together to get a whole perspective. Perhaps then we won't be so frightened that something we create as a copy of ourselves will be so threatening to us.

DW: Science fiction has often been remarkable at predicting the future - from Arthur C Clarke's idea of satellite communication to Star Trek's communicators, which now look old-fashioned compared to modern mobile phones. Science fiction has painted a vivid spectrum of possible futures, from cute and helpful robots (Star Wars) to dystopian robotic societies (I, Robot). Interestingly, almost no science fiction envisages a future without robots.

The text in this work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License. Images, including our videos, are Copyright © University of Cambridge and licensors/contributors as identified. All rights reserved. We make our image and video content available in a number of ways that permit your use and sharing of our content under their respective Terms.


Robot debates humans about the dangers of artificial intelligence

By Donna Lu

24 November 2019

Project Debater argued both for and against the benefits of artificial intelligence

An artificial intelligence has debated the dangers of AI – narrowly convincing audience members that the technology will do more good than harm.

Project Debater, a robot developed by IBM, spoke on both sides of the argument, with two human teammates for each side helping it out. Talking in a female American voice to a crowd at the University of Cambridge Union on Thursday evening, the AI gave each side’s opening statements, using arguments drawn from more than 1100 human submissions made ahead of time.

On the proposition side, arguing that AI will bring more harm than good, Project Debater’s opening remarks were darkly ironic. “AI can cause a lot of harm,” it said. “AI will not be able to make a decision that is the morally correct one, because morality is unique to humans.”

“AI companies still have too little expertise on how to properly assess datasets and filter out bias,” it added. “AI will take human bias and will fixate it for generations.”

The AI used an application known as “speech by crowd” to generate its arguments, analysing submissions people had sent in online. Project Debater then sorted these into key themes, as well as identifying redundancy – submissions making the same point using different words.
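IBM has not published the internals of "speech by crowd", but the two steps the article describes — sorting submissions into key themes and flagging redundancy (the same point made in different words) — can be sketched with a simple bag-of-words cosine similarity. This is an illustrative sketch only, not IBM's implementation; the greedy grouping strategy and the similarity threshold are assumptions.

```python
# Hypothetical sketch of theme grouping and redundancy detection,
# loosely following the two steps described in the article.
from collections import Counter
import math


def bow(text: str) -> Counter:
    """A crude bag-of-words vector: lowercase word counts."""
    return Counter(text.lower().split())


def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two bag-of-words vectors."""
    dot = sum(a[w] * b[w] for w in set(a) & set(b))
    norm = math.sqrt(sum(v * v for v in a.values())) * \
           math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0


def group_submissions(submissions, threshold=0.5):
    """Greedy clustering: each submission joins the first existing
    theme it resembles closely enough, otherwise it starts a new one.
    Submissions landing in the same theme are treated as redundant."""
    themes = []  # list of (representative vector, member texts)
    for text in submissions:
        vec = bow(text)
        for rep, members in themes:
            if cosine(rep, vec) >= threshold:  # same point, similar words
                members.append(text)
                break
        else:
            themes.append((vec, [text]))
    return [members for _, members in themes]


subs = [
    "AI will take away many jobs",
    "AI will take away jobs from many workers",
    "machines cannot make moral decisions",
]
print(group_submissions(subs))
```

With the inputs above, the first two submissions share most of their words and fall into one theme, while the third starts a theme of its own. A production system would use far stronger text representations than raw word counts, but the pipeline shape is the same.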

Project Debater summarised arguments put forward by humans

The AI argued coherently but had a few slip-ups. Sometimes it repeated itself – while talking about the ability of AI to perform mundane and repetitive tasks, for example – and it didn’t provide detailed examples to support its claims.

While debating on the opposition side, which was advocating for the overall benefits of AI, Project Debater argued that AI would create new jobs in certain sectors and “bring a lot more efficiency to the workplace”.

But then it made a point that was counter to its argument: “AI capabilities caring for patients or robots teaching schoolchildren – there is no longer a demand for humans in those fields either.”

The pro-AI side narrowly won, gaining 51.22 per cent of the audience vote.

Project Debater argued with humans for the first time last year, and in February this year lost in a one-on-one against champion debater Harish Natarajan, who also spoke at Cambridge as the third speaker for the team arguing in favour of AI.

IBM has plans to use the speech-by-crowd AI as a tool for collecting feedback from large numbers of people. For instance, it could be used by governments seeking public opinions about policies or by companies wanting input from employees, said IBM engineer Noam Slonim.

“This technology can help to establish an interesting and effective communication channel between the decision maker and the people that are going to be impacted by the decision,” he said.

Elon Musk’s Tesla is promising to sell a humanoid robot. It could be the first of many

Professor of Collaborative Computing, University of Nottingham

Professor of Embodied Intelligence, School of Computer Science, University of Nottingham

Disclosure statement

Steve Benford receives funding from UKRI, EPSRC, AHRC and the European Union.

Praminda Caleb-Solly is Professor of Embodied Intelligence at the University of Nottingham. She is also the Co-Founder and Director of Robotics for Good, a Community Interest Company. She receives funding from the UKRI, EPSRC, AHRC, NIHR and the European Union. She is Co-Chair of the IEEE Robotics and Automation Society Technical Committee for Robot Ethics.

University of Nottingham provides funding as a founding partner of The Conversation UK.

Elon Musk’s recent announcement on Twitter that “Tesla will have genuinely useful humanoid robots in low production for Tesla internal use next year” suggests that robots that have physical human-like characteristics and provide “genuinely useful” function might be with us soon.

However, despite decades of trying, useful humanoid robots have remained a fiction that never seems to quite catch up with reality. Are we finally on the cusp of a breakthrough? It's worth questioning whether we really need humanoid robots at all.

Tesla’s Optimus robot is just one of several emerging humanoid robots, joining the likes of Boston Dynamics’ Atlas, Figure AI’s Figure 01, Sanctuary AI’s Phoenix and many others. They usually take the form of a bipedal platform that is variously capable of walking and, sometimes, jumping, along with other athletic feats. On top of this platform a pair of robot arms and hands may be mounted, capable of manipulating objects with varying degrees of dexterity and tactility.

Behind the eyes lies artificial intelligence tailored to planning navigation, recognising objects and carrying out tasks with these objects. The most commonly envisaged uses for such robots are in factories , carrying out repetitious, dirty, dull and dangerous tasks, and working alongside humans, collaboratively, carrying a ladder together for example.

They are also proposed for work in service industry roles, perhaps replacing the current generation of more utilitarian “meet and greet” and “tour guide” service robots. They could possibly be used in social care, where there have been attempts to use robots to lift and move humans, such as the Riken Robear (admittedly more bear than humanoid), and to deliver personal care and therapy.

There is also a more established and growing market in humanoid sex robots. Interestingly, while many people recognise the moral and ethical issues related to these, the use of humanoid robots in other areas seems to attract less controversy. It is, however, proving challenging to deliver humanoid robots in practice. Why should this be so?

There are numerous engineering challenges, such as achieving flexible bipedal locomotion on different terrain. It took humans about four million years to achieve this, so where we are now with humanoid robots is pretty impressive. But humans learn to combine a complex set of sensing capabilities to achieve this feat.

Similarly, achieving the dexterous manipulation of objects, which come in all shapes, sizes, weights and levels of fragility, is proving stubborn with robots. There has been significant progress, though, such as the dexterous hands from UK company Shadow Robot.

Compared to the human body, which is covered in a soft and flexible skin that continuously senses and adapts to the world, robots’ tactile capabilities are limited to only a few points of contact, such as the fingertips.

Moving beyond automating specific tasks on factory assembly lines to improvising general tasks in a dynamic world demands greater progress in artificial intelligence as well as sensing and mechanical capabilities. Finally, if you are going to make a robot look human, then there is an expectation that it would also need to communicate with us like a human, perhaps even respond emotionally .

However, this is where things can get really tricky, because if our brains, which have evolved to recognise non-verbal elements of communication, don’t perceive all the micro-expressions that are interpreted at a subconscious level, the humanoid robot can come across as positively creepy.

These are just a few of the major research challenges that are already taxing communities of researchers in robotics and human-robot interaction across the globe. There’s also the additional constraint of deploying humanoid robots in our ever-changing noisy real world, with rain, dust and heat. These are very different conditions to the ones they’re tested in. So shouldn’t we focus on building systems that are more robust and won’t succumb to the same pitfalls that humans do?

Recreating ourselves

This brings us to the question of why Musk and many others are focused on humanoid robots. Must our robotic companions look like us? One argument is that we have gradually adapted our world to suit the human body. For example, our buildings and cities are largely constructed to accommodate our physical form. So an obvious choice is for robots to assume this form as well.

It must be said, though, that our built environments and tools often assume a certain level of strength, dexterity and sensory ability, which disadvantages a vast number of people, including those who are disabled. So would the rise of stronger metal machines among us further perpetuate this divide?

Perhaps we should see robots as being part of the world that we need to create which better accommodates the diversity of human bodies. We could put more effort into integrating robotics technologies into our buildings, furniture, tools and vehicles, making them smarter and more adaptable, so that they become more accessible for everyone.

It is striking how the current generation of limited robot forms fails to reflect the diversity of human bodies. Perhaps our apparent obsession with humanoid robots has other, deeper roots. The god-like desire to create versions of ourselves is a fantasy played out time and time again in dystopian science fiction, from which the tech industry readily appropriates ideas.

Or perhaps, humanoid robots are a “Moon shot”, a vision that we can all understand but is incredibly difficult to achieve. In short, we may not be entirely sure why we want to go there, but impressive engineering innovations are likely to emerge from just trying.



111 Robots Essay Topic Ideas & Examples

🏆 Best Robots Topic Ideas & Essay Examples, 👍 Good Essay Topics on Robots, ⭐ Simple & Easy Robots Essay Titles, ❓ Questions About Robots.

  • Robots and Artificial Intelligence One the one hand, with artificial intelligence and fully autonomous robots, organizations will be able to optimize their spending and increase the speed of development and production of their commodities.
  • Discussion: Will Robots Replace Us? The world is moving forward, space and the ocean’s depths, and the peculiarities of the brain’s structure and the human body are being studied.
  • Robots’ Impact and Human Employment Opportunities Many of the costs of complying with the isolation rules, the costs associated with the spread of the disease, can actually be offset by replacing the workforce with robots.
  • Robots: The Use in Everyday Tasks The recent advancements in robotics and artificial intelligence have the potential to automate a wide range of human activities and to dramatically reshape the way people live and work in the coming decades.
  • Characteristics of Robotics What concerns the elaboration of an obstacle course in a “real-world” simulation, it is essential to ensure the presence of several procedure testing steps that will determine the functionality of a robot. What concerns the […]
  • Use of Robots in Computer Science Currently, the most significant development in the field of computer science is the inclusion of robots as teaching tools. The use of robots in teaching computer science has significantly helped to endow students with valuable […]
  • Robot Revolution in the Contemporary Society The lack of human resources in the middle of the 20th century and the development of industrial technologies led to the appearance of robots.
  • Visions of the Future in the Film I, Robot Even though some of the aspects of the filmmaker’s vision of future are possible, and very likely to become reality, the essence of the film appears highly unrealistic.
  • The Wireless Robotic Car: Design Project In this prototype, the task is to design a robotic car that can be controlled by a computer using wireless communication technology.
  • Autonomous Robots Since they are self-sufficient, autonomous robots have the capacity to work in the absence of human beings. In the future, humanoid robots might have the intelligence and emotions similar to those of human […]
  • The Place of Humanity in the Robotic Future The developers are trying to implement the brain, the human mind, in a digital environment. Paying attention to mechanical machines, commonly called “robots”, can be seen that they are created in the image and likeness […]
  • The Dyson Robotic Vacuum: Target Group and Marketing Plan Thus, the target audience of Dyson in Ontario is practical and prudent people who, when buying equipment, pay attention primarily to the prestige of the brand, the quality, and the durability of the purchased goods.
  • Aliens Concept in “I, Robot” by Alex Proyas: Film Analysis The purpose of this paper is to analyze the concept of aliens and its implications in the movie I, Robot. It is possible to state that modern advancements are the reflection of something different from […]
  • Isaac Asimov’s “Robot Dreams” and Alex Proyas’ “I, Robot” Driving to work involves the use of evolving technology as every car made today includes varying degrees of computerized information systems that inform the vehicle of important information everything from the need for an oil […]
  • 3D Robotics Disrupts the Aviation Industry 3D Robotics describe their business model as perceiving open hardware, drones, and the future of robotics as the part of the community and the company.
  • STEM (Science), Robots, Codes, Maker’s Space Overview Students’ interest in STEM, Robotics, Coding, and Engineering education and professions has been shown to be stimulated by early exposure to STEM knowledge.
  • Using Robots in the Medical Industry Third, the robot surgery further has been observed to increase comfort on the part of the patient as the surgery proceeds, and this results from ergonomic position that the robot assumes as the operation proceeds.
  • Robot Making: Materials for Building and Economic Factor As the science is progressing in recent times, we can be sure that it is a matter of time when we will get some economical alternatives of the materials that are needed to make a […]
  • Robots as a Factor in Unemployment Patterns One of the prevailing arguments in regards to this problem is that the advent of the robot technology is contributing towards a high rate of unemployment.
  • Spot Mini Robot by Boston Dynamics While the bigger robots by Boston Dynamics are designed to operate in extreme conditions, Spot Mini is a household robot, which makes it marketable to a wider community and, therefore, profitable.
  • Is the Robotics Development Helpful or Harmful? Robots remain the best option, as they will connect the children with the happenings in the school. They will dress the robot with their favorite clothes, communicate with the teacher using the robot, and swivel […]
  • Robotic Pharmacy System Implementation Citing some of the key benefits of the robotic pharmacy system, one of the most important is that it reduces the need for technical labor significantly.
  • Robotics and Artificial Intelligence in Organizations Otherwise, cognitively complex tasks and those demanding emotional intelligence will be performed by humans, with the support of robotics and AI. Therefore, this study speaks of the importance of employee trust in AI and organization.
  • Robotics in Construction: Automated and Semi-Automated Devices The robot is fitted with ultrasonic sensors that aid in positioning of the water jet in inclined areas and also the sensors determine the distance of concrete removal.
  • Will Robots Ever Replace Humans? It is quite peculiar that Bolonkin uses negation in order to stir the audience’s delight; more impressively, the specified approach works the pathos is concealed not in the description of the possibilities, but the compliment […]
  • Ways that Robotics Can Transform Our Daily Lives Robots will help to increase the labor force in the country in the future. Robots will be used to increase the productivity of human labor within the government sector and help in speeding up the […]
  • Exploring the Capabilities and Potential of Soft Robotics One of the critical advantages of soft robots is their ability to deform and adapt to their surroundings, making them ideal for tasks that require a high degree of flexibility and expertise.
  • Mobile Robots: Impact on Supply Chain Management According to the article, some of the advantages of using an RSC include the ability to dump reusable components and emissions during transit, and presence of collection, recovery, recycling, dismantling, and re-manufacturing facilities.
  • Drawing 3D Objects With Use of Robotic Arm The hot end of the printer melts the material and embeds it onto the surface onto the intended surface. The research also utilized the Arduino development board to interface the programs written and the physical […]
  • Robotic Process Automation Implementation Robotics in the tax system is a highly rational, reasonable, and beneficial idea that will help improve the service and make any process more accessible.
  • The Hybrid Robot Vacuum Cleaners The EUFY series of hybrid vacuum cleaners is one of the most popular choices in the market, and the company offers products in various pricing ranges. In the context of hybrid robot vacuum cleaners, market […]
  • Robotics and Related Social & Political Problems The combination of engineering and computer science has aided people in developing the field of robotics. The social impact of robotics lies in the problems that robots are designed to solve.
  • The Invento Robotics Products Analysis The 5 C’s of brand management has grown in popularity since it thoroughly evaluates all the important aspects of a company and allows for approach adjustments depending on what is and is not effective.
  • Hyper Evolution: The Rise of the Robots From the video, the robots look like real human beings, and they have been capacitated to act in a human way in what is known as machine learning technology powered by artificial intelligence. Hyper evolution […]
  • Amazon’s AI-Powered Home Robots The objective of the present plan is to provide a comprehensive analysis and evaluation of the introduction of AI-powered home robots as Amazon’s next disruptive customer product.
  • Robots on the Battlefield: Benefits vs. Constraints The principal obstacle to the introduction of robots on the battlefield is related to the impossibility of operating in the current environment.
  • Robotic Snowblower’s Segmentation, Targeting, and Positioning Strategy For success, a business needs to conduct a structured analysis of the market and competitors, segment consumers into narrow groups, assess the market’s attractiveness, and correctly position the brand.
  • Healthcare Robots: Entering the Era of a Technological Breakthrough However, using robots as medical doctors’ assistants has been only a figment of the most daring dreams until recently.
  • “A Robot Can Be Warehouse Worker’s Pal” by Jennifer Smith Employees working alongside the robots are guided adequately. This method makes it possible for companies to achieve their objectives in a timely manner.
  • Boston Dynamics’ Spot Robot Dog Spot is a four-legged robot that evolved from SpotMini (the initial version) that offers multiple capabilities of operation, including climbing, jumping, walking.
  • Artificial Intelligence in “I, Robot” by Alex Proyas To begin with, AI is defined by Nilsson as a field of computer science that attempts to enhance the level of intelligence of computer systems.
  • Disinfecting Robots: Care Ethics, and Design Thus, the utilization of this technology may be expected to reduce the incidence rate of HAIs. However, it is essential to consider the cost of this technology and reimbursement as they may be key factors […]
  • Robot Interaction Language (ROILA) and Robot Creativity The difference of ROILA from other languages for computing is that it should be simple for both machines and humans to understand.
  • The Personal and Servicing Robotic Market For the product to receive a successful launch, the focus will be placed on the target market and not the product features.
  • Process Description of a Rescue Robot Roboticists in the physical design of rescue robots ensure that the robots can traverse places that are physically unreachable to human rescuers and additionally equip them with a variety of distributed technology that enable them […]
  • The Tactical Throwable Robot The main technical characteristics of the machine are given below in the table offered by Czupryniak Rafal and Trojnazki Maziej in their article “Throwable tactical robot description of construction and performed tests”.
  • Wireless Robotic Car: Servo Motors and DC Motors This section focuses on the review of literature on servo motors and DC motors, in general as well as in the context of the current research project.
  • Autonomous Mobile Robot: GPS and Compass The other realization is that in most instances the challenges presented in the motion of the appendages of a particular robot are not only limited to the number of joints but can significantly exceed the […]
  • Whats Mean Robotics Welding Epping and Zhang define robotic welding as the utilization of programmable systems and tools that mechanize and automate the way welding is done.
  • Are Robots About to Enter the Healthcare Workforce? Many new technologies must first overcome several obstacles in order to become a part of the service environment, and robots are no exception.
  • The Influence of Robots and AI on Work Relationships In the early 20th century, Taylor’s work focused on production management and labor efficiency, which led to the attention of managers to the problems of selection, the motivation of employees, and their training.
  • Robots in Today’s Society: Artificial Intelligence The most important is the automation of the repeating process, to liberate human power, and avoid mistakes and delays in the processes.
  • Intelligent Transportation Systems: A Robot Project The construction of the robot involved the use of sensors and microchips, accessories also used in ITS technology. The role of the sensors in the robot was to detect obstacles and red light on the […]
  • I, Robot and the Effects of Technology The judgment call is generally made on the quality of life of the humans, with little to no regard for the lifestyle and options available to the robots who have achieved a higher level of […]
  • The Use of Robotics in the Operating Room The da Vinci surgical system is the first and one of the famous Robotics surgical systems used in the operating room.
  • The Connection Between Science and Technology: The Robotic Fish by Professor HU Furthermore, we discuss the other effects of science in technology and some of the recent technological developments in the rest of the world.
  • Knowledge of Saudi Nurse Managers Towards Robots The main objective of this study is to investigate the attitudes and knowledge of Saudi nurse managers towards the adoption of robotics for remote monitoring and management of elderly patient with chronic illness in an […]
  • Robotics. “Humans Need Not Apply” Video Mechanical muscles are more strong and reliable than humans, and the replacement of people by mechanisms in physical work allows society to specialize in intellectual work, develop economics and raise the standards of living.
  • Questionable Future of Robotics In this case, the lecture, which was focusing on the flow of robotics’ development, influenced my perception about the future, robotics’ impact on our lives, and the ability of robots to destroy the humanity.
  • Technology: Will Robots Ever Replace Humans? According to the author, one’s intelligence is not being solely concerned with the processing of data in the algorithmic manner, as it happened to be the case with AI it reflects the varying ability of […]
  • Double Robotics Website’s Tracking Strategy The goals of the Doublerobotics.com website are to familiarize audiences with the telepresence industry and to convince both corporate and individual potential customers to purchase a robot.
  • Robot-Assisted Rehabilitation: Article Critique The information about the groups of participants was available to clinicians and study personnel since the only post-stroke individual in the sample needed special procedures to participate.
  • Robotics in Construction Management: Impacts and Barriers The assessment of the economic feasibility of the robotization of individual construction processes is based on cost analysis and the calculation of payback.
  • Rights of ‘Feeling’ Robots and Humans Many futurists believe strongly that new laws will be needed to tame the behaviors and actions of robots. That being the case, autonomous robots might take advantage of their rights to control human beings.
  • Electronic or Robotic Companions: Business Model A device whose usage will help to break down the language barrier. The speech of any speaker will be translated and presented to the owner of the device in his or her native language.
  • Robotics’ Sociopolitical and Economic Implications The foremost benefits of Robotics for individuals can be formulated as follows: The continual development/implementation of the Robotics-related technologies will increase the chances of self-actualization, on the part of the potentially affected individuals.
  • Stihl Company and Its Robotics Automation involves the use of robots in the production process. The company’s productivity has come as a result of the automation production practices and its presence across the globe.
  • Welcome Robotic for Abu Dhabi Women College In the year 2009, the college opened a second branch in the city of Khalifa to cater for students who encounter problems relocating to the capital city.
  • Fiat Company: Deployment of Robotics in Manufacturing The technology also enhanced the reduction of production costs by reducing the number of working days without affecting the production and performance of the company at its peak.
  • Projects “Cyborg” and “New Electrical Apparatus” in Robotics In fact, although Project Cyborg included some medical expertise, the purpose is significantly similar to the project by Nicholson and Carlisle largely because a medical achievement is not one of their aims.
  • Meteorite or Puck Hunt: Autonomous Mobile Robot The Development of the Design Since this was the first time we took part in this type of competition, we decided to work out a plan that would help us develop the autonomous mobile robot […]
  • Marketing the Wireless Robotic Car By sending the robotic car to a chemical hazard, it is possible to determine the extent of spillage of a liquid or a solid pollutant.
  • The Use of Robots in Warfare The military advancement in the use of robots in warfare will ultimately reduce the role of human beings in war drastically. The increased use of robots on the battlefield requires countries to […]
  • A Mobile Robotic Project in the Ohio State University Medical Center In order for the project to be successful there must be a one-to-one contact between those implementing the project and the staff at the hospital.
  • Autonomous Controller Robotics: The Future of Robots The middle level is the Coordination level, which interfaces the actions of the top and lower levels in the architecture.
  • How Will Autonomous Robots Change Military Tactics?
  • Will Romantic Relationships Be Formed With Robots?
  • What Were the First Industrial Robots in America Used For?
  • Will Robots and Humanoids Take Over the World?
  • Are Robots Beneficial for the Society?
  • Will Robots Automate Your Job Away?
  • Why Not Use Robots to Stabilize Stock Markets?
  • Will Robots Change Our Lives in the Future?
  • How Can Robots Affect Children’s Development?
  • Will Robots Create Economic Utopia?
  • Why Are Robots Taking Over the World With Breakthrough Technology?
  • Will Robots Live With Humans in Harmony?
  • Can Humanoid Service Robots Perform Better Than Service Employees?
  • How Can Robots Be Used to Help Students?
  • Will Robots One Day Rule the World?
  • Why Should Robots Not Be Pursued?
  • How Do Robots Impact Careers in the Medical Field?
  • Why Will Robots Always Need Us?
  • Are Robots Taking Control of Human Tasks?
  • How Can Robots Have Human-Like Intelligence?
  • Can Service Robots Hamper Customer Anger and Aggression After a Service Failure?
  • Are Robots the Solution to Equality in the Job Interview Process?
  • How Can Robots Replace 60% of Jobs?
  • Are Sex Robots the Next Big Sexual Revolution?
  • How Can Robots Solve the Problem of an Aging Population?
  • Are Surgical Robots the Future of Medicine?
  • How Can Robots Work More Efficiently Than Humans?
  • Should Robots’ Intelligence Become Smarter Than Ours?
  • What Are Robots and How Are They Being Used Nowadays?
  • Are Robots and Animals More or Less Similar to One Another Than Robots and Humans?

IvyPanda. (2024, February 29). 111 Robots Essay Topic Ideas & Examples. https://ivypanda.com/essays/topic/robots-essay-topics/


English Summary

2 Minute Speech On Robots In English

Good morning to everyone in this room. I would like to thank the principal, the teachers, and my dear friends for allowing me to speak to you today about robots. Robots are machines with characteristics and behaviors that resemble those of humans, and they carry out tasks according to their programming. For the past decade or more, robots have been significantly reducing human workloads, notably in the industrial sector.

Repetitive work also wears people down, leaving them less effective than they were when they started. As a result, individuals become burned out and lose the desire or excitement to continue working. Robots are especially useful in this situation, since they make humans’ lives simpler than before.

Thanks to robotics and artificial intelligence, humans can now complete difficult or time-consuming activities with ease, and this technology is only likely to advance in the future. Thank you.

Related Posts:

Video Friday: Silly Robot Dog Jump

Your weekly selection of awesome robot videos.

Evan Ackerman is IEEE Spectrum’s robotics editor.

Stop that, it's too silly.

Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion.

ICRA@40: 23–26 September 2024, Rotterdam, Netherlands

IROS 2024: 14–18 October 2024, Abu Dhabi, UAE

ICSR 2024: 23–26 October 2024, Odense, Denmark

CYBATHLON 2024: 25–27 October 2024, Zurich

Enjoy today’s videos!

The title of this video is “Silly Robot Dog Jump” and that’s probably more than you need to know.

[ Deep Robotics ]

It’ll be great when robots are reliably autonomous, but until they get there, collaborative capabilities are a must.

[ Robust AI ]

I am so INCREDIBLY EXCITED for this.

[ IIT Instituto Italiano di Tecnologia ]

In this three-minute, one-take video, the LimX Dynamics CL-1 takes on the challenge of continuously loading heavy objects among shelves in a simulated warehouse, showcasing the advantages of the general-purpose form factor of humanoid robots.

[ LimX Dynamics ]

Birds, bats and many insects can tuck their wings against their bodies when at rest and deploy them to power flight. Whereas birds and bats use well-developed pectoral and wing muscles, how insects control their wing deployment and retraction remains unclear because this varies among insect species. Here we demonstrate that rhinoceros beetles can effortlessly deploy their hindwings without necessitating muscular activity. We validated the hypothesis using a flapping microrobot that passively deployed its wings for stable, controlled flight and retracted them neatly upon landing, demonstrating a simple, yet effective, approach to the design of insect-like flying micromachines.
Agility Robotics’ CTO, Pras Velagapudi, talks about data collection, and specifically about the different kinds we collect from our real-world robot deployments and generally what that data is used for.

[ Agility Robotics ]

Robots that try really hard but are bad at things are utterly charming.

[ University of Tokyo JSK Lab ]

The DARPA Triage Challenge unsurprisingly has a bunch of robots in it.

The Cobalt security robot has been around for a while, but I have to say, the design really holds up—it’s a good looking robot.

[ Cobalt AI ]

All robots that enter elevators should be programmed to gently sway back and forth to the elevator music. Even if there’s no elevator music.

[ Somatic ]

ABB Robotics and the Texas Children’s Hospital have developed a groundbreaking lab automation solution using ABB’s YuMi® cobot to transfer fruit flies (Drosophila melanogaster) used in the study for developing new drugs for neurological conditions such as Alzheimer’s, Huntington’s and Parkinson’s.
Extend Robotics are building embodied AI enabling highly flexible automation for real-world physical tasks. The system features an intuitive immersive interface enabling tele-operation, supervision, and training of AI models.

[ Extend Robotics ]

The recorded livestream of RSS 2024 is now online, in case you missed anything.

[ RSS 2024 ]


Evan Ackerman is a senior editor at IEEE Spectrum. Since 2007, he has written over 6,000 articles on robotics and technology. He has a degree in Martian geology and is excellent at playing bagpipes.


Figure’s new humanoid robot leverages OpenAI for natural speech conversations


Figure has unveiled its latest humanoid robot, the Figure 02. The system is — as its name helpfully suggests — the successor to the Figure 01 robot unveiled in 2023. An initial teaser video is similar to those we’ve seen from other humanoids, echoing consumer electronics product videos rather than a raw demo of the robot in action.

Another video released Tuesday showcases the robot’s slow, bent-leg gait across the floor of what looks to be the demo area constructed in the middle of Figure’s offices. Another two robots appear in the background, carting totes — the biggest out-of-the-box application for most of these humanoids.

The most notable addition this time out arrives by way of a longstanding partnership with OpenAI, which helped Figure raise a $675 million Series B back in February, valuing the South Bay firm at $2.6 billion.

The mainstream explosion of neural networks has been enticing for the robotics industry at large, but humanoid developers have taken a particular interest in the technology. One of the form factor’s key selling points is its ability to effectively slot alongside human co-workers on a factory floor — once the proper safety measures are in place, of course. Figure 02 is outfitted with speakers and microphones to speak and listen to people at work.

Models like ChatGPT and Google Gemini have been prized for their natural language capabilities, ushering in a new era of smart assistants and chatbots. Outfitting these systems with such capabilities is a no-brainer: doing so helps humans instruct the robots, while at the same time adding a level of transparency to what the robot is doing at any given time.

Communication like this is doubly important when dealing with humanoid robots, as the systems are designed to wander freely without a safety cage. Despite their human-like design, it’s important not to lose sight of the fact that they’re still big, heavy and potentially dangerous pieces of moving metal. Combined with vision and proximity sensors, speech can be an important safety tool.

Figure certainly isn’t alone in this work. Late last year, Agility showcased the work it’s been doing to leverage generative AI for improved human-robot communication. The use of neural networks was a key focus for Google’s Everyday Robots team before it was shuttered. Elon Musk, meanwhile, is ostensibly in charge of both Grok AI and Optimus — two projects that will no doubt dovetail sooner rather than later.

For its part, OpenAI has hedged its bets a bit in the category. Prior to its Figure investment, the firm backed Norwegian firm 1X. Over the past year, however, Figure has become far buzzier in the industry. Its aforementioned Series B also included other top tech names like Microsoft, Amazon, Nvidia and Intel Capital.

Figure recently began pilots with BMW. In June, the company debuted a video showcasing an earlier, tethered version of the robot autonomously performing tasks on the floor, with the help of neural networks.

The company notes that the 02 robot has already paid a visit to the automaker’s Spartanburg, South Carolina, facility for training and data collection purposes. We’re still very much in the early stages of these partnerships. Agility, Apptronik and Sanctuary AI have announced similar pilots with carmakers. Working on Teslas has been a key focus for Optimus since before it was Optimus, and Boston Dynamics-owner Hyundai has its sights set on humanoids in its own factories.

Communication is one piece of what Figure is referring to as a “ground-up hardware and software redesign” between 01 and 02. The list also includes six RGB cameras, coupled with an onboard visual language model, improved CPU/GPU computing and improved hands, with 16 degrees of freedom.

Hands have been their own hot-button topic in the humanoid robot world. There are differing opinions regarding how closely designers should hew to their human counterparts.

There’s a lot to be said for the nimbleness and dexterity of our appendages, though human-inspired hands have been criticized for their delicacy and perceived over-engineering. Figure, for its part, has been dedicated to using humanlike hands as its system’s end effectors.

We don’t have a timeline for a wider Figure 02 rollout, though the company is hinting at a broader future outside the warehouse/factory floor. “Figure’s robot combines the dexterity of the human form with advanced AI to perform a wide range of tasks across commercial applications and, in the near future, the home,” the company writes.


Invention by local high school girls could lead to changes in the way hospital staff communicate

HOLMDEL — Three local high school students have captured worldwide recognition for their invention: A robot that uses AI to read American Sign Language and play corresponding piano keys.

“We wanted to sort of bridge the gap between … people who use Sign Language on a day-to-day basis and people who speak spoken language,” said Aditi Gopalakrishnan, 16, of Piscataway.

The musically inclined robot is only a preliminary phase of a bigger plan. The three girls want to pave the way for a future where deaf people can more effectively communicate with medical professionals without the need for a human translator.

"I volunteer in hospitals, so I've encountered patients who have language barriers," Gopalakrishnan said, noting that sometimes people who can translate are not always available.

Gopalakrishnan explained that their current model could work in the medical field as a fully-equipped, multi-language AI translator, allowing hospital staffers to communicate with people who are deaf or use Sign Language. 

Maya Baireddy, 15, of Holmdel, and Julia Chan, 15, of Bridgewater, make up the other members of the JAM session team; the three initials of their first names combine to create the J, A, and M.

RoboCup Junior is an international competition that encourages high-school-aged students to explore their interest in technology. The JAM team represented the United States.

On July 16, the three competed in the “onstage,” or performance, challenge which took place in Eindhoven, Netherlands.

For their performance, the USA team created a green monster with large googly eyes that would play keys on a keyboard that corresponded to Sign Language letters the girls formed with their hands. The girls used training data and AI tools to have the robot recognize the letters as they held up a hand to the camera on the back of the contraption.
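The recognition step described above (a camera image of a hand mapped to a letter) can be sketched, in a heavily simplified form, as a nearest-centroid classifier over hand-landmark features. Everything below is a hypothetical illustration, not the team's actual code: the toy 2-D feature vectors, the three letters, and all function names are invented for the example.

```python
import math

def centroid(vectors):
    """Average a list of equal-length feature vectors."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def train(examples):
    """examples: {letter: [feature_vector, ...]} -> {letter: centroid}."""
    return {letter: centroid(vecs) for letter, vecs in examples.items()}

def classify(model, vector):
    """Return the letter whose centroid is nearest (Euclidean) to vector."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(model, key=lambda letter: dist(model[letter], vector))

# Toy 2-D "landmark" features standing in for real hand-landmark data.
training = {
    "J": [[0.1, 0.9], [0.2, 0.8]],
    "A": [[0.9, 0.1], [0.8, 0.2]],
    "M": [[0.5, 0.5], [0.6, 0.4]],
}
model = train(training)
print(classify(model, [0.15, 0.85]))  # closest to the "J" examples
```

A real system would extract dozens of landmark coordinates per camera frame and train on many labeled examples per letter, but the classification step follows the same pattern: compare a new feature vector against per-letter summaries learned from training data.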

Out of 24 teams from 21 participating countries, JAM session earned second place.

Gopalakrishnan, Baireddy and Chan attend Woodbridge Academy Magnet School, High Technology High School in Middletown, and Bridgewater Raritan High School, respectively.

They worked on the project at the Storming Robots robotics center in Branchburg, Somerset County, where their mentor recommended that they take part in the RoboCup Junior competition.

The three teens eventually won first place in the national competition in May, at the Wardlaw-Hartridge School in Edison. That punched their tickets as the only U.S. representative in the Netherlands.

Gopalakrishnan spearheaded the sign language detection algorithm; Baireddy worked on the frame, costume, and 3D printing the mechanisms that pressed the keys; and Chan worked on the motors, which allowed for the precise movement of the mechanisms that pressed the keys.

The team had been working on the project since last December, but never lost interest in robotics.

"Probably the opposite," said Chan.

Baireddy added, "I actually enjoyed it more after this," and the other girls agreed vehemently.

The girls emphasized that through the competitions they learned the importance of communication with each other, time management and scheduling, as well as the importance of having spare parts to help avoid possible mechanical failures.

"Spares are very, very, very important," said Chan.

They said that Murphy's Law, the idea that anything that can go wrong will, was a big part of the process. Now, they can laugh about it.

Parts broke and even the frame fell apart at one point in the national preparations. The girls used those failures to better prepare for the international competition.

They now want to turn their good showing into something that spreads. The trio intend to start a group that encourages young girls to find mentorship in robotics.

They noted that STEM fields are male-dominated: about 80 percent of competitors in the Junior and Major leagues of the RoboCup competitions were male.

"We've been thinking about starting a group that helps people, especially younger girls," said Gopalakrishnan. "We've all been at a point in our robotics journey where we wanted a mentor."

What can robots teach us about being human?

Robots reflect their creators: Us. Counterintuitive and insightful, these robots show us glimmers into what it means to be human.

Robots with "soul"

4 lessons from robots about being human

Why we will rely on robots

Robots that "show emotion"

What is ChatGPT? Here's everything you need to know about OpenAI's chatbot

  • ChatGPT is getting a futuristic human update. 
  • ChatGPT has attracted users at a feverish pace and spurred Big Tech to release other AI chatbots.
  • Here's how ChatGPT works — and what's coming next.


OpenAI has started rolling out an advanced voice mode for its blockbuster chatbot ChatGPT.

Sam Altman's company began rolling out the chatbot's new voice mode to a small group of ChatGPT Plus users in July. OpenAI said the new voice feature "offers more natural, real-time conversations, allows you to interrupt anytime, and senses and responds to your emotions."

The feature is part of OpenAI's wider GPT-4o launch, a new version of the bot that can hold spoken conversations with users and has vision capabilities. The vision features are expected in a later release.

The move is a big step for the future of AI-powered virtual assistants, which tech companies have been racing to develop.

Since its release in late 2022, hundreds of millions of people have experimented with the tool, which is already changing how the internet looks and feels to users.

Users have flocked to ChatGPT to improve their personal lives and boost productivity. Some workers have used the AI chatbot to develop code, write real estate listings, and create lesson plans, while others have made teaching the best ways to use ChatGPT a career all to itself.

ChatGPT offers dozens of plug-ins to ChatGPT Plus subscribers. An Expedia plug-in can help you book a trip, while one from OpenTable will nab you a dinner reservation. OpenAI has also launched Code Interpreter, a version of ChatGPT that can code and analyze data.

While the personal tone of conversations with an AI bot like ChatGPT can evoke the experience of chatting with a human, the technology, which runs on large language models, isn't sentient and doesn't "think" the way humans do.

That means that even though ChatGPT can explain quantum physics or write a poem on command, a full AI takeover isn't exactly imminent, according to experts.

"There's a saying that an infinite number of monkeys will eventually give you Shakespeare," said Matthew Sag, a law professor at Emory University who studies copyright implications for training and using large language models like ChatGPT.

"There's a large number of monkeys here, giving you things that are impressive — but there is intrinsically a difference between the way that humans produce language, and the way that large language models do it," he said. 

Chatbots like ChatGPT are powered by vast amounts of data and by computing techniques that predict how to string words together in a meaningful way. They not only tap into an enormous vocabulary and store of information, but also understand words in context. This helps them mimic speech patterns while dispensing encyclopedic knowledge.
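To make the "predicting how to string words together" idea concrete, here is a deliberately tiny sketch: a bigram model that picks the word most often seen after the current one in a toy corpus. This is an illustration of next-word prediction in general, not ChatGPT's actual architecture, which uses a neural network trained on far more data and far longer context.

```python
from collections import Counter, defaultdict

# Toy corpus; a real model trains on billions of words.
corpus = "the robot moves the arm and the robot lifts the part".split()

# Count which word follows which: next_words["the"] -> Counter of successors.
next_words = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    next_words[prev][nxt] += 1

def predict(word):
    """Return the most frequently observed next word, or None if unseen."""
    counts = next_words.get(word)
    return counts.most_common(1)[0][0] if counts else None

print(predict("the"))  # "robot" follows "the" most often in this corpus
```

A large language model replaces these raw counts with learned probabilities conditioned on the entire preceding passage, which is what lets it keep track of context rather than just the previous word.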

Other tech companies like Google and Meta have developed their own large language model tools, which use programs that take in human prompts and devise sophisticated responses.

Despite the AI's impressive capabilities, some have called out OpenAI's chatbot for spewing misinformation, stealing personal data for training purposes, and even encouraging students to cheat and plagiarize on their assignments.

Some efforts to use chatbots for real-world services have proved troubling. In 2023, the mental health company Koko came under fire after its founder wrote about how the company used GPT-3 in an experiment to reply to users. 

Koko cofounder Rob Morris hastened to clarify on Twitter that users weren't speaking directly to a chatbot, but that AI was used to "help craft" responses. 

Read Insider's coverage of ChatGPT and some of the strange new ways that both people and companies are using chatbots:

The tech world's reception to ChatGPT:

Microsoft is chill with employees using ChatGPT — just don't share 'sensitive data' with it.

Microsoft's investment into ChatGPT's creator may be the smartest $1 billion ever spent

ChatGPT and generative AI look like tech's next boom. They could be the next bubble.

The ChatGPT and generative-AI 'gold rush' has founders flocking to San Francisco's 'Cerebral Valley'

Insider's experiments: 

I asked ChatGPT to do my work and write an Insider article for me. It quickly generated an alarmingly convincing article filled with misinformation.

I asked ChatGPT and a human matchmaker to redo my Hinge and Bumble profiles. They helped show me what works.

I asked ChatGPT to reply to my Hinge matches. No one responded.

I used ChatGPT to write a resignation letter. A lawyer said it made one crucial error that could have invalidated the whole thing.

Read ChatGPT's 'insulting' and 'garbage' 'Succession' finale script

An Iowa school district asked ChatGPT if a list of books contains sex scenes, and banned them if it said yes. We put the system to the test and found a bunch of problems.

Developments in detecting ChatGPT: 

Teachers rejoice! ChatGPT creators have released a tool to help detect AI-generated writing

A Princeton student built an app which can detect if ChatGPT wrote an essay to combat AI-based plagiarism

Professors want to 'ChatGPT-proof' assignments, and are returning to paper exams and requesting editing history to curb AI cheating


ChatGPT in society: 

BuzzFeed writers react with a mix of disappointment and excitement at news that AI-generated content is coming to the website

ChatGPT is testing a paid version — here's what that means for free users

A top UK private school is changing its approach to homework amid the rise of ChatGPT, as educators around the world adapt to AI

Princeton computer science professor says don't panic over 'bullshit generator' ChatGPT

DoNotPay's CEO says threat of 'jail for 6 months' means plan to debut AI 'robot lawyer' in courtroom is on ice

It might be possible to fight a traffic ticket with an AI 'robot lawyer' secretly feeding you lines to your AirPods, but it could go off the rails

Online mental health company uses ChatGPT to help respond to users in experiment — raising ethical concerns around healthcare and AI technology

What public figures think about ChatGPT and other AI tools:

What Elon Musk, Bill Gates, and 12 other business leaders think about AI tools like ChatGPT

Elon Musk was reportedly 'furious' at ChatGPT's popularity after he left the company behind it, OpenAI, years ago

CEO of ChatGPT maker responds to schools' plagiarism concerns: 'We adapted to calculators and changed what we tested in math class'

A theoretical physicist says AI is just a 'glorified tape recorder' and people's fears about it are overblown

'The most stunning demo I've ever seen in my life': ChatGPT impressed Bill Gates

Ashton Kutcher says your company will probably be 'out of business' if you're 'sleeping' on AI

ChatGPT's impact on jobs: 

AI systems like ChatGPT could impact 300 million full-time jobs worldwide, with administrative and legal roles some of the most at risk, Goldman Sachs report says

Jobs are now requiring experience with ChatGPT — and they'll pay as much as $800,000 a year for the skill

ChatGPT may be coming for our jobs. Here are the 10 roles that AI is most likely to replace.

AI is going to eliminate way more jobs than anyone realizes

It's not AI that is going to take your job, but someone who knows how to use AI might, economist says

4 careers where workers will have to change jobs by 2030 due to AI and shifts in how we shop, a McKinsey study says

Companies like Amazon, Netflix, and Meta are paying salaries as high as $900,000 to attract generative AI talent

How AI tools like ChatGPT are changing the workforce:

10 ways artificial intelligence is changing the workplace, from writing performance reviews to making the 4-day workweek possible

Managers who use AI will replace managers who don't, says an IBM exec

How ChatGPT is shaping industries: 

ChatGPT is coming for classrooms, hospitals, marketing departments, and everything else as the next great startup boom emerges

Marketing teams are using AI to generate content, boost SEO, and develop branding to help save time and money, study finds

AI is coming for Hollywood. 'It's amazing to see the sophistication of the images,' one of Christopher Nolan's VFX guys says.

AI is going to offer every student a personalized tutor, founder of Khan Academy says

A law firm was fined $5,000 after one of its lawyers used ChatGPT to write a court brief riddled with fake case references

How workers are using ChatGPT to boost productivity:  

CheatGPT: The hidden wave of employees using AI on the sly

I used ChatGPT to talk to my boss for a week and she didn't notice. Here are the other ways I use it daily to get work done.

I'm a high school math and science teacher who uses ChatGPT, and it's made my job much easier

Amazon employees are already using ChatGPT for software coding. They also found the AI chatbot can answer tricky AWS customer questions and write cloud training materials.

How 6 workers are using ChatGPT to make their jobs easier

I'm a freelance editor who's embraced working with AI content. Here's how I do it and what I charge.

How people are using ChatGPT to make money:

How ChatGPT and other AI tools are helping workers make more money

Here are 5 ways ChatGPT helps me make money and complete time-consuming tasks for my business

ChatGPT course instruction is the newest side hustle on the market. Meet the teachers making thousands from the lucrative gig.

People are using ChatGPT and other AI bots to work side hustles and earn thousands of dollars — check out these 8 freelancing gigs

A guy tried using ChatGPT to turn $100 into a business making 'as much money as possible.' Here are the first 4 steps the AI chatbot gave him

We used ChatGPT to build a 7-figure newsletter. Here's how it makes our jobs easier.

I use ChatGPT and it's like having a 24/7 personal assistant for $20 a month. Here are 5 ways it's helping me make more money.

A worker who uses AI for a $670 monthly side hustle says ChatGPT has 'cut her research time in half'

How companies are navigating ChatGPT: 

From Salesforce to Air India, here are the companies that are using ChatGPT

Amazon, Apple, and 12 other major companies that have restricted employees from using ChatGPT

A consultant used ChatGPT to free up time so she could focus on pitching clients. She landed $128,000 worth of new contracts in just 3 months.

Luminary, an AI-generated pop-up restaurant, just opened in Australia. Here's what's on the menu, from bioluminescent calamari to chocolate mousse.

A CEO is spending more than $2,000 a month on ChatGPT Plus accounts for all of his employees, and he says it's saving 'hours' of time

How people are using ChatGPT in their personal lives:

ChatGPT planned a family vacation to Costa Rica. A travel adviser found 3 glaring reasons why AI won't replace experts anytime soon.

A man who hated cardio asked ChatGPT to get him into running. Now, he's hooked — and he's lost 26 pounds.

A computer engineering student is using ChatGPT to overcome learning challenges linked to her dyslexia

How a coder used ChatGPT to find an apartment in Berlin in 2 weeks after struggling for months

Food blogger Nisha Vora tried ChatGPT to create a curry recipe. She says it's clear the instructions lacked a human touch — here's how.

Men are using AI to land more dates with better profiles and personalized messages, study finds

Lawsuits against OpenAI:

OpenAI could face a plagiarism lawsuit from The New York Times as tense negotiations threaten to boil over, report says

This is why comedian Sarah Silverman is suing OpenAI, the company behind ChatGPT

2 authors say OpenAI 'ingested' their books to train ChatGPT. Now they're suing, and a 'wave' of similar court cases may follow.

A lawsuit claims OpenAI stole 'massive amounts of personal data,' including medical records and information about children, to train ChatGPT

A radio host is suing OpenAI for defamation, alleging that ChatGPT created a false legal document that accused him of 'defrauding and embezzling funds'

Tips on how to write better ChatGPT prompts:

7 ways to use ChatGPT at work to boost your productivity, make your job easier, and save a ton of time

I'm an AI prompt engineer. Here are 3 ways I use ChatGPT to get the best results.

12 ways to get better at using ChatGPT: Comprehensive prompt guide

Here's 9 ways to turn ChatGPT Plus into your personal data analyst with the new Code Interpreter plug-in

OpenAI's ChatGPT can write impressive code. Here are the prompts you should use for the best results, experts say.

Watch: What is ChatGPT, and should we be afraid of AI chatbots?
