From the 1st Industrial Revolution to Industry 4.0
The wonders of AI powered Robotics
Some of us were there when, during the '80s, the launch of the Walkman revolutionized the way people could experience their bus ride to work, their jogging sessions, or the time spent relaxing on the couch, listening to their favorite songs whenever and wherever they wanted.
In 2020, we are testing driverless cars, which will soon(?) bring us to our destinations and back with minimal effort. You can read our article on self-driving vehicles, in which we discuss some of the main problems that separate us from the era of automated cars.
What we are experiencing is the so-called 4th industrial revolution, or Industry 4.0. Let’s summarize the main steps of our technological evolution. Each of them had a profound impact on society and the global economy.
The first industrial revolution (England, 1780-1830) was powered by the growth of the textile industry and driven by James Watt’s powerful and efficient steam engine: a machine that could transfer energy to other machines, allowing humanity to build the first locomotives to move people and goods, and paving the way to industrial mechanization.
The second industrial revolution (1870-1913) was characterized by the mass production of steel, electrification and the consequent reorganization of manufacturing around the assembly line, and the invention of the automobile.
We then arrive at the third industrial revolution (1970-2000), the one that makes it possible for us to communicate through blog posts, for example. Informatics and electronics broke into people’s lives, and the computer became a must-have object for individuals and families. Many tasks previously performed by humans became fully or partially automated. The Internet boosted the globalization process, disrupting the world of communications and information.
From 2000 to today, and more to come: Industry 4.0, the age of robots and connected things. IoT and AI applied to computer vision, enhanced by sophisticated machine learning algorithms, have had a huge impact on every industry, from manufacturing and entertainment to personal services and medical care. Objects and their sensors can be connected to a network, exchange information, and use it to make decisions and perform their specific tasks more efficiently, thanks to the data available and the power of artificial intelligence. Data can now be distributed across large areas to provide useful information to people, such as traffic updates. These technologies enrich the toolset of physicians, helping people get faster and more accurate diagnoses, ad-hoc therapeutic approaches, and better medicines.
Robot is not originally an English word. Its origins go back to 1920, when the Czech writer Karel Čapek derived it from the word robota (forced labor) to describe, in his play R.U.R., androids who fight against their masters to win freedom from slavery.
The very first robots were designed and built during WWII at the American Argonne and Oak Ridge National Laboratories. The first robot for teleoperation, instead, was introduced in France: in 1951, at the CEA (Commissariat à l’énergie atomique), Raymond Goertz invented a remotely controlled (teleoperated) arm to manipulate radioactive material without endangering the operator.
A robot can be defined as a machine capable of some kind of perception, of computing or “thinking”, and of taking action.
A drone, for example, can rely on sensors like a gyroscope, an accelerometer, and often a GPS system to determine its absolute position in space. All the information received from the sensors is then processed and used, for example, to stabilize the drone.
Robots are nowadays employed in a wide variety of industries. The da Vinci robot, for instance, allows surgeons to operate with much higher precision, raising the success rate of invasive surgeries. Companies are investing more and more to innovate their processes with robots: Amazon bought a startup specialized in autonomous warehouse robotics for better management of its stock. Industrial spaces themselves are being reshaped and optimized for robots.
Robotics is now a 50-billion-USD industry, and its products are bound to increase their impact with the introduction of exoskeletons, personal care robots, and telepresence devices.
Telepresence technologies will give people the chance to manage their cyber avatars remotely. Want to see something really cool? Look at the product invented by TELEXISTENCE, a Japanese company.
With the spread of 5G connectivity, the potential of such robots will receive a huge boost.
A further challenge that the creators of personal robots have to overcome concerns the safety of robot-to-human interaction, a much more critical task than managing robot-environment safety. The same issue applies, of course, to the development of self-driving vehicles.
Touch-screen devices are everywhere, from the omnipresent smartphone to cars’ infotainment systems. We use them to regulate the temperature in our homes, or to pay for the products we buy at the supermarket. You have probably experienced the struggle of dealing with a bad touchscreen, when you cannot trigger the right feature of the app you are trying to use.
The experience is slow, boring, and frustrating. To avoid these unpleasant issues, QA testers go through the application’s features, running multiple types of tests over and over to collect and deliver the information necessary to debug the program. It is easy to imagine how important the reliability of touch-screen devices is in all those contexts where people’s safety is at stake: medical devices, industrial machines, vehicles. For such products, each unit of the device should be tested before being released on the market, forcing software testers to run an enormous number of manual tests.
A very repetitive task indeed. Wouldn’t it be nice if a robot could assist humans in the hard enterprise of touchscreen testing?
The goal is to provide people with a pleasant and safe user experience. To do so, a robot should interact with the touchscreen device just as a human would. It should be able to recognize the app’s features, interact with them, and verify whether a specific feature responds correctly to a command, returning the right output.
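The recognize-interact-verify cycle just described can be sketched as a simple test loop. Everything here is a hypothetical stand-in: the feature detector plays the role of the robot's vision system, and the fake device simulates the touchscreen under test.

```python
def detect_features(screenshot):
    """Hypothetical vision step: map on-screen widgets to coordinates."""
    return {"volume_up": (120, 300), "volume_down": (120, 380)}

def run_test(device, feature, expected_state):
    """One test cycle: locate the widget, tap it, verify the response."""
    x, y = detect_features(device.screenshot())[feature]
    device.tap(x, y)                                # robot finger press
    return device.read_state() == expected_state    # did the app respond?

class FakeDevice:
    """Simulated touchscreen standing in for real hardware."""
    def __init__(self):
        self.volume = 5
    def screenshot(self):
        return None  # a real system would capture the screen here
    def tap(self, x, y):
        self.volume += 1 if y < 350 else -1  # toy app logic
    def read_state(self):
        return self.volume

device = FakeDevice()
print(run_test(device, "volume_up", expected_state=6))  # True
```

A real testing robot repeats this loop thousands of times, logging every mismatch between the expected and observed state for the debugging team.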
A touchscreen-testing robot like MATT is equipped with a three-fingered end-effector, to reproduce human multi-touch gestures, and a computer vision system capable of recognizing the app’s features and detecting whether they function correctly. Watch the video to see how such a robot tests a car infotainment system.