AI Powered Robotic Arm

A robotic arm powered by Artificial Intelligence (AI) is one of the most promising emerging technologies; robotics and automation are shaping our future. In this project, we will learn how to train a model and program a robotic arm with it. A robotic arm is a type of mechanical arm, usually programmable, with functions similar to a human arm; the arm may be the complete mechanism or part of a more complex robot. Robots already help humans in their daily routines: typical examples are cleaning robots and pick-and-place robots, which are mainly used for industrial purposes.

Researchers from UC Berkeley have developed a robotic arm platform, named Blue, which uses AI and deep reinforcement learning to perform intricate human tasks, such as folding towels, arranging flowers, pouring a cup of coffee and cleaning up the mess afterward. Its arms are about the size of a bodybuilder's, made of durable plastic parts, and are sensitive to outside forces, like a hand pushing them away, which makes it safe to work around. The arm is meant to provide a cheap yet powerful platform for AI experimentation; the team likens their creation to the Apple II, the personal computer that attracted hobbyists and hackers in the 1970s and '80s and ushered in a technological revolution. Robots and AI have evolved in parallel as areas of research for decades. In recent years, however, AI has advanced rapidly when applied to abstract problems like labeling images or playing video games, while industrial robots, which can do things very precisely, still require painstaking programming and cannot adapt to even the slightest changes. Cheaper, safer robots have emerged, but most are not designed specifically to be controlled by AI software.

Another study from Rutgers University coincides with the growing trend of deploying robots to perform logistics, retail and warehouse tasks. Advances in robotics are accelerating at an unprecedented pace due to machine learning algorithms that allow for continuous experiments. Tightly packing products picked from an unorganized pile remains largely a manual task, even though it is critical to warehouse efficiency. Automating such tasks is important for companies' competitiveness and allows people to focus on less menial and physically taxing work, according to the Rutgers scientific team.

The Rutgers study focused on placing objects from a bin into a small shipping box and tightly arranging them. This is a more difficult task for a robot compared with just picking up an object and dropping it into a box. The researchers developed software and algorithms for their robotic arm. They used visual data and a simple suction cup, which doubles as a finger for pushing objects. The resulting system can topple objects to get a desirable surface for grabbing them. Furthermore, it uses sensor data to pull objects toward a targeted area and push objects together. During these operations, it uses real-time monitoring to detect and avoid potential failures. Since the study focused on packing cube-shaped objects, a next step would be to explore packing objects of different shapes and sizes. Another step would be to explore automatic learning by the robotic system after it is given a specific task.

Why: Problem statement

Robotic arms are normally controlled by command inputs. This consumes more time and is not effective for complex human tasks. AI- and deep-reinforcement-learning-controlled robotic arms have therefore been developed that can perform complex human tasks and be voice-controlled.

How: Solution description

We propose an AI-powered robotic arm that uses deep learning with an R-CNN to detect the user's requested object and locate its coordinates. To pick up the object, we use a LewanSoul robotic arm with six degrees of freedom.

Process:

Let's begin the process. To obtain the object coordinates, follow the procedure described in the project Object Detection Challenge.
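
The detection details live in the Object Detection Challenge project. As a rough illustration only, a minimal Python sketch of turning a detected bounding box into arm-frame coordinates might look as follows; the helper names, calibration origin and scale below are placeholder assumptions, not values from this project.

```python
# Sketch: convert a detected bounding box into a target (x, y) for the arm.
# Assumes a fixed overhead camera calibrated so that pixel offsets map
# linearly to centimetres; origin and cm_per_px are illustrative only.

def box_center(box):
    """Return the pixel centre of an (x1, y1, x2, y2) bounding box."""
    x1, y1, x2, y2 = box
    return ((x1 + x2) / 2.0, (y1 + y2) / 2.0)

def pixel_to_arm(px, py, origin=(320, 240), cm_per_px=0.05):
    """Map a pixel position to arm-frame coordinates in centimetres."""
    ox, oy = origin
    return ((px - ox) * cm_per_px, (py - oy) * cm_per_px)

cx, cy = box_center((300, 220, 340, 260))  # centre of the detected box
x, y = pixel_to_arm(cx, cy)                # arm-frame target
```

In practice the calibration would be measured once by placing a marker at a known arm position and reading off its pixel coordinates.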

Inverse Kinematics

After getting the coordinates from the object detection stage, we have to find the angle theta for each servo on the robotic arm. For this, we solve the inverse kinematics using the Jacobian matrix method, which means deriving the Jacobian matrix equation. We need to follow the steps below:

Step 1: Kinematic Diagram

We need to draw the kinematic diagram of the robotic arm, labeling each joint and link.

Step 2: Rotation Matrix

We have to form the rotation matrix for each joint using the above kinematic diagram.
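
As a concrete sketch (in Python with NumPy, which the later Python coding phase can reuse), the basic rotation matrices about z and x, which are the ones the Denavit-Hartenberg convention needs, can be written as:

```python
import numpy as np

def rot_z(theta):
    """Basic rotation about the z-axis by angle theta (radians)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0],
                     [s,  c, 0.0],
                     [0.0, 0.0, 1.0]])

def rot_x(alpha):
    """Basic rotation about the x-axis by angle alpha (radians)."""
    c, s = np.cos(alpha), np.sin(alpha)
    return np.array([[1.0, 0.0, 0.0],
                     [0.0, c, -s],
                     [0.0, s,  c]])

# Rotations compose by matrix multiplication, e.g. a joint rotation
# followed by a link twist:
R = rot_z(np.pi / 2) @ rot_x(0.0)
```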

Step 3: Displacement vectors

Using the kinematic diagram, we write down the displacement vector from each joint frame to the next.

Step 4: Homogeneous Transformation Matrix

Using the rotation matrices and displacement vectors, we form the homogeneous transformation matrix for each joint.
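
A minimal NumPy sketch of assembling a 4x4 homogeneous transform from a rotation matrix and a displacement vector, and using it to move a point between frames:

```python
import numpy as np

def homogeneous(R, d):
    """Combine a 3x3 rotation R and a displacement vector d into a 4x4 transform."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = d
    return T

# Example: a pure 10 cm lift along z (identity rotation), applied to a
# point expressed in homogeneous coordinates (x, y, z, 1).
T = homogeneous(np.eye(3), [0.0, 0.0, 10.0])
p = T @ np.array([1.0, 2.0, 3.0, 1.0])
```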

Step 5: Forward Kinematics

To derive the inverse kinematics, we first need the forward kinematics of the robotic arm, obtained with the Denavit-Hartenberg method. We then form the forward kinematics equation by chaining the link transforms.
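
The standard Denavit-Hartenberg link transform can be sketched in Python as follows. The DH parameters of the LewanSoul arm must be measured from the actual hardware; the planar two-link values in the example are illustrative only.

```python
import numpy as np

def dh_transform(theta, d, a, alpha):
    """Standard Denavit-Hartenberg transform for one link."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0]])

def forward_kinematics(thetas, dh_params):
    """Chain the link transforms; returns the end-effector pose as a 4x4 matrix."""
    T = np.eye(4)
    for theta, (d, a, alpha) in zip(thetas, dh_params):
        T = T @ dh_transform(theta, d, a, alpha)
    return T

# Illustrative planar two-link arm: (d, a, alpha) per link, unit link lengths.
two_link = [(0.0, 1.0, 0.0), (0.0, 1.0, 0.0)]
T = forward_kinematics([0.0, 0.0], two_link)  # arm stretched along x
```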

Step 6: Inverse Kinematics

From the above equation, we can derive the inverse kinematics solution using the Jacobian matrix method.
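
Deriving the analytic Jacobian for all six joints is lengthy, so here is a sketch of the same idea in numerical form: approximate the Jacobian by finite differences and iterate pseudo-inverse updates toward the target. The planar two-link `fk_position` is a stand-in for the real arm's forward kinematics, not the LewanSoul chain itself.

```python
import numpy as np

def fk_position(thetas, l1=1.0, l2=1.0):
    """End-effector position of a planar two-link arm (illustrative stand-in)."""
    t1, t2 = thetas
    return np.array([l1 * np.cos(t1) + l2 * np.cos(t1 + t2),
                     l1 * np.sin(t1) + l2 * np.sin(t1 + t2),
                     0.0])

def numeric_jacobian(fk, thetas, eps=1e-6):
    """Finite-difference Jacobian of end-effector position w.r.t. joint angles."""
    p0 = fk(thetas)
    J = np.zeros((3, len(thetas)))
    for j in range(len(thetas)):
        t = np.array(thetas, dtype=float)
        t[j] += eps
        J[:, j] = (fk(t) - p0) / eps
    return J

def solve_ik(fk, target, thetas, iters=200, alpha=0.5):
    """Iterate damped pseudo-inverse updates: theta += alpha * pinv(J) @ error."""
    thetas = np.array(thetas, dtype=float)
    for _ in range(iters):
        err = np.asarray(target) - fk(thetas)
        J = numeric_jacobian(fk, thetas)
        thetas += alpha * (np.linalg.pinv(J) @ err)
    return thetas

angles = solve_ik(fk_position, [1.0, 1.0, 0.0], [0.5, 0.5])
```

The damping factor `alpha` trades convergence speed for stability near singular configurations; for the real arm, `fk_position` would be replaced by the Denavit-Hartenberg forward kinematics.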

Serial Communication

After deriving the inverse kinematics equations, we can compute the angle of each servo by giving the X, Y and Z coordinates as input. The resulting servo angles are then sent to the Arduino over a serial link using PySerial.
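
A minimal host-side sketch using PySerial; the port name, baud rate and helper names are assumptions to adapt to your setup. The 500-2500 pulse range matches the map() calls in the Arduino code later in this write-up.

```python
# Sketch: map solved servo angles to LeArm pulse widths and send a
# position command to the Arduino. Port name and baud rate are assumptions.

def angle_to_pulse(deg, lo=500, hi=2500):
    """Map an angle in [0, 180] degrees to the 500-2500 us servo pulse range."""
    return int(lo + (hi - lo) * deg / 180.0)

def send_position(ch, port="/dev/ttyUSB0", baud=9600):
    """Send a one-character position command ('A'..'D') to the Arduino sketch."""
    import serial  # pyserial; assumed installed (pip install pyserial)
    with serial.Serial(port, baud, timeout=1) as ser:
        ser.write(ch.encode())

pulse = angle_to_pulse(90)  # mid-range angle
```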

Arduino & Servo Driver

All servo angles are given to the Arduino as input. Using the LeArm (LobotServoController) library, we can control the servo movement. A detailed explanation is given in the comments of the code snippet below.

How is it different from competition

The robotic arm is voice-controlled instead of relying on command inputs. It is human-friendly and cost-effective. Unlike other robotic arms, which use explicit commands, this arm learns by itself using machine learning and AI, so it does not always need human commands.

Who are your customers

  • Industrial users
  • Research students
  • Household users

Project Phases and Schedule

Phase 1: Setting up the Arm.

Phase 2: Deriving Forward and Inverse kinematics equations for 6DOF.

Phase 3: Object detection and applying the inverse kinematics equations in Python code.

Phase 4: Through serial communication, transferring the data.

Phase 5: Testing in real-time.

Resources Required

Hardware:

  1. Robotic Arm with 6 Degrees of Freedom - LewanSoul
  2. Raspberry Pi Model B+
  3. Arduino
  4. Camera

Software:

  1. Python IDE
  2. Arduino IDE
  3. Jupyter Notebook

Project Code
/* Your file Name : arm_rightV3-PickingObj-yosnalabcode.ino */
/* Your coding Language : arduino */
/* Your code snippet start here */
#include <LobotServoController.h> // library for LeARM robot
LobotServoController myse(Serial); // For serial communication 
int six,five,four,three,two,one,cm; // pulse values for the six servos and a distance in cm
char A,B,C,D,Pos; // position labels for placing the object
#include <Wire.h>

#define SLAVE_ADDRESS 0x04
int number[6] = {0,0,0,0,0,0};
int state = 0;
int i=1; // index into the number[] receive buffer


void setup() {
  // setup code here, to run once:
  Serial.begin(9600);
  myse.moveServos(5,1000,1,1500,2,1500,3,1500,4,1500,5,1500); // initial position of the robot arm
  delay(2000);
Wire.begin(SLAVE_ADDRESS);

Wire.onReceive(receiveData);
//delay(4000);

//Wire.onReceive(receiveData1);

Serial.println("Ready!");
}
int b,c;
char pause=' '; // waiting for serial commands

void loop()
{


  
  //Serial.println("enter the position: ");
  if(Serial.available())
  {
    Serial.print("enter the pos = ");
    
    char ch = Serial.read(); // reading the serial data
    if (ch == 'A'){ // checking condition if the serial data is A position
      Serial.println('A');
      six=135;
      //five=map(cm,0,20,500,2500);
     myse.moveServo(5,1800,1000); 
     //four=map(cm,0,20,500,1500);
     myse.moveServo(4,700,1000); 
      //three=map(cm,0,20,550,1500);
     myse.moveServo(3,1000,1000); 
     six=map(six,0,180,500,2500);
      myse.moveServo(6,six,1000); 
      delay(2000);
      myse.moveServo(1,1500,1000);
       delay(1000);
       //six=135;
       //six=map(six,0,180,500,2500);
       //myse.moveServo(6,six,1000);
       myse.moveServo(5,1800,1000);
       delay(1000);
       myse.moveServo(4,780,1000);
       delay(1000);
       myse.moveServo(3,766,1000);
       delay(1000);
       myse.moveServo(1,2500,1000);
       delay(1000);
       myse.moveServos(5,1000,1,2500,2,1500,3,1500,4,1500,5,1500);
      
      }
    else if(ch == 'B')  
    {
      Serial.println('B');
      six=45;
      //five=map(cm,0,20,500,2500);
     myse.moveServo(5,1800,1000); 
     //four=map(cm,0,20,500,1500);
     myse.moveServo(4,700,1000); 
      //three=map(cm,0,20,550,1500);
     myse.moveServo(3,1000,1000); 
     six=map(six,0,180,500,2500);
      myse.moveServo(6,six,1000); 
      
      }
      else if(ch =='C')
    {
      Serial.println('C');
      six=0;
      //five=map(cm,0,20,500,2500);
     myse.moveServo(5,1800,1000); 
     //four=map(cm,0,20,500,1500);
     myse.moveServo(4,700,1000); 
      //three=map(cm,0,20,550,1500);
     myse.moveServo(3,1000,1000); 
     six=map(six,0,180,500,2500);
      myse.moveServo(6,six,1000); 
      
      }
      else if(ch=='D')
    {
      Serial.println('D');
      six=180;
      //five=map(cm,0,20,500,2500);
     myse.moveServo(5,1800,1000); 
     //four=map(cm,0,20,500,1500);
     myse.moveServo(4,700,1000); 
      //three=map(cm,0,20,550,1500);
     myse.moveServo(3,1000,1000); 
     six=map(six,0,180,500,2500);
      myse.moveServo(6,six,1000); 
      
      }
      
   }
}
void receiveData(int byteCount){
while(Wire.available()) {
if (i > 5) i = 1; // guard: never write past the end of number[]
number[i] = Wire.read();


if(number[i]==150)
{
 if(number[0]>90)
   number[0]=-(255-number[0]);//subtract the values from 255 to get the transmitted value
  if(number[1]>90)
   number[1]=-(255-number[1]);//subtract the values from 255 to get the transmitted value
  if(number[2]>90)
   number[2]=-(255-number[2]);//subtract the values from 255 to get the transmitted value
  if(number[3]>90)
   number[3]=-(255-number[3]);//subtract the values from 255 to get the transmitted value

 
number[0]=map(number[0],-90,90,2500,500);//map the theta 1 values for servo 2
number[1]=map(number[1],-90,90,500,2500);//map the theta 2 values for servo 3
number[2]=map(number[2],-90,90,2500,500);//map the theta 3 values for servo 4
number[3]=map(number[3],-90,90,500,2500);//map the theta 4 values for servo 5
 
 Serial.print(number[0]);
 Serial.print(" "); 
  Serial.print(number[1]);
 Serial.print(" "); 
 Serial.print(number[2]);
 Serial.print(" "); 
 Serial.print(number[3]);
 myse.moveServos(4,500,2,number[0],3,number[1],4,number[2],5,number[3]);//moves servos 2,3,4,5 to desired location
 delay(4000);
 
 myse.moveServo(1,2240,500);//moves servo 1
i = 1; // reset the buffer index for the next packet
break;
 
}
i=i+1;

}
}


About This Project

Project period

09/30/2019 - 10/29/2019

