
INTERNSHIP REPORT

ASSESSMENT OF ORGANISATION STUDY OF IIOT


Submitted for partial fulfilment of the requirements for the award of
B.Tech IN
ELECTRONICS AND COMMUNICATION ENGINEERING
SAHIL KUMAR [21104133021]
Under the guidance of
Dr. BRAJESH KUMAR
HOD, Department of Electronics and Communication Engineering

Department of Electronics and Communication Engineering


GOVERNMENT ENGINEERING COLLEGE JAMUI
AMRATH, JAMUI, 811313

(The institute is academically governed by Bihar Engineering University (BEU), Patna, which is the degree-awarding authority for the institute's B.Tech. programmes, and is recognized by the A.I.C.T.E., New Delhi, and the Government of Bihar.)

Internship Report
On
INDUSTRIAL INTERNET OF THINGS (IIOT)
Submitted for partial fulfilment of the requirements for the award of
B.Tech IN
ELECTRONICS AND COMMUNICATION ENGINEERING
Submitted by
SAHIL KUMAR
Semester : 5th

Reg.No: 21104133021

Internship Carried Out


at
ROBOMANTHAN Pvt. Ltd
50, JKN Arcade, Ground Floor, 27th Main, 1st Cross Rd, BTM 1st Stage,
Bengaluru, Karnataka 560 068

Internal Guide: Dr. BRAJESH KUMAR, HOD
External Guide: SAURAV KUMAR, Chief Executive Officer

DEPARTMENT OF TECHNICAL EDUCATION
GEC JAMUI
ELECTRONICS AND COMMUNICATION ENGINEERING

CERTIFICATE
This is to certify that SAHIL KUMAR, Reg. No: 21104133021, has satisfactorily submitted the internship report titled "INDUSTRIAL INTERNET OF THINGS (IIOT)" in partial fulfillment of the requirements prescribed by the DEPARTMENT OF TECHNICAL EDUCATION for the V Semester in ELECTRONICS AND COMMUNICATION ENGINEERING, and has submitted this report during the academic year 2024-2025.

Dr. BRAJESH KUMAR
Head of the Department

EXTERNAL VIVA

Name of the Examiners Signature with Date

1.

2.

ACKNOWLEDGEMENT

The satisfaction and euphoria that accompany the successful completion of any task would be incomplete without mentioning the people who made it possible, and whose constant guidance and encouragement served as a beacon of light and crowned our efforts with success.

I take it as a privilege to express, through this page of my internship report, a few words of cordial gratitude and respect to all those who guided and inspired me at every step towards the completion of my internship.

First of all, I would like to thank the management of ROBOMANTHAN Pvt. Ltd for providing me an opportunity to undergo industrial training in their company.

I would like to convey my gratitude and thanks to the company's Chief Executive Officer and industry training supervisor, Mr. SAURAV KUMAR, for guiding me and for all the support and guidance during the training period as my mentor.

My sincere thanks to my internship guide, Dr. BRAJESH KUMAR, Head of the Department, Electronics and Communication Engineering, GEC Jamui, for the encouragement and guidance throughout the internship.

I express my indebtedness and sincere gratitude to the Head of the Department, Dr. BRAJESH KUMAR.

I would like to thank my "Parents", who are the first God and who have given me their blessings and strength and stayed with me throughout the successful completion of this report.

DECLARATION

I, SAHIL KUMAR, bearing Reg. No: 21104133021, a student of B.Tech in ELECTRONICS AND COMMUNICATION ENGINEERING, GEC JAMUI, hereby declare that I carried out my internship at ROBOMANTHAN Pvt. Ltd and prepared the internship report entitled "INDUSTRIAL INTERNET OF THINGS (IIOT)" under the guidance of Dr. BRAJESH KUMAR (internal guide) and Mr. SAURAV KUMAR, Chief Executive Officer (external guide). This report has not been submitted to any other organization/university for the award of any diploma, degree, or certificate.

Date:18/06/2024 SAHIL KUMAR


Place: JAMUI Reg. No.: 21104133021

Executive Summary

During my internship at RoboManthan, I had the opportunity to engage in a variety of projects and activities that significantly contributed to my professional development and understanding of futuristic technologies. RoboManthan's commitment to empowering young individuals with practical knowledge in robotics, IoT, AI, and other emerging fields is commendable, and I am grateful for the valuable experiences I gained during my time there.

One of the key highlights of my internship was the chance to work on the development of educational content and workshops aimed at bridging the gap between academia and industry. This experience not only enhanced my technical skills but also gave me a deeper insight into the challenges and opportunities in engineering education today.

Additionally, I had the privilege of collaborating with a talented team of professionals who were dedicated to creating innovative solutions and empowering the next generation of technologists. Their guidance and mentorship were invaluable, and I am grateful for the knowledge and skills I acquired through their support.

Overall, my internship at RoboManthan was a rewarding experience that has equipped me with the skills, knowledge, and confidence to pursue a successful career in the field of futuristic technologies. I am thankful for the opportunity to be a part of such a dynamic and forward-thinking organization, and I look forward to applying the lessons learned during my internship to future endeavors.

CONTENTS

Sl. No TITLE Page No.


1. Chapter 1: Introduction to IOT
• Features of IOT
• Advantages of IOT
• Disadvantages of IOT
• Application grounds of IOT
• IOT technologies and protocols
• IOT software

2. Chapter 2: Description of OJT -1 Project


• Introduction
• Interface L298N DC Motor Driver Module
With ESP32
• L298N Motor Driver Module
• Arduino Sketch
• Demonstration
• Conclusion
3. Chapter 3: Description of OJT -2
• Introduction
• Objectives Of The Project
• Project Overview
• Demonstration
• Conclusion

4. Chapter 4: Description of use case-1, use case -2


• Use case-1 Flow Diagram
• Use case-2 Flow Diagram

LIST OF FIGURES

Sl. No TITLE

Chapter 1:
1. Working of IOT-enabled care devices
2. IOT-controlled greenhouse environment

Chapter 2: Description of OJT-1 Project
1. Figure 2.1 L298N Motor Driver Module
2. Figure 2.2 L298N Module Specification
3. Figure 2.3 Speed Control (ENABLE) Pins
4. Figure 2.4 Circuit Diagram of Mobile Controlled ESP32 Two-Wheel Drive Robot
5. Figure 2.5 MIT App Inventor for App Development
6. Figure 2.6 App development programming (Blocks) of Mobile Controlled ESP32 Two-Wheel Drive Robot
7. Figure 2.7 Snip of User Interface of Mobile Controlled ESP32 Two-Wheel Drive Robot
8. Figure 2.8 User Interface of Mobile Controlled ESP32 Two-Wheel Drive Robot
9. Figure 2.9 Mobile Controlled ESP32 Two-Wheel Drive Robot wiring
10. Figure 2.10 Mobile Controlled ESP32 Two-Wheel Drive Robot working ON

LIST OF FIGURES

Sl. No TITLE

Chapter 3:

1. Figure 3.1 Arduino Pin out diagram


2. Figure 3.2 2-Channel relay Pin out diagram
3. Figure 3.3 Arduino IDE Logo
4. Figure 3.4 PyCharm Logo
5. Figure 3.5 OpenCV Logo
6. Figure 3.6 Circuit Diagram of Home Automation _AI
with all the Interfacing Module’s
7. Figure 3.7 Home Automation _AI Face Not Detected
8. Figure 3.8 Home Automation _AI Face Detected

Chapter 4:

1. Figure 4.1 Use case-1 flow diagram


2. Figure 4.2 Use case-2 flow diagram
CHAPTER 1

1.1 IOT (INTERNET OF THINGS)

IOT as a term has evolved a long way as a result of the convergence of multiple technologies: machine learning, embedded systems, and commodity sensors. IOT is a system of interconnected devices, each assigned a unique identifier (UID), enabling data transfer and control of devices over a network. It reduces the need for physical interaction in order to control a device. IOT is an advanced automation and analytics system which exploits networking, sensing, big data, and artificial intelligence technology to deliver complete systems for a product or service. These systems allow greater transparency, control, and performance when applied to any industry or system.
1.1.1 Features of IOT
1.1.1.1 Intelligence
IOT comes with a combination of algorithms and computation, software and hardware, that makes it smart. Ambient intelligence in IOT enhances its capabilities, facilitating things to respond in an intelligent way to a particular situation and supporting them in carrying out specific tasks. In spite of all the popularity of smart technologies, intelligence in IOT is mostly concerned with interaction between devices, while user-device interaction is achieved through standard input methods and graphical user interfaces.

1.1.1.2 Connectivity
Connectivity empowers the Internet of Things by bringing together everyday objects. Connectivity of these objects is pivotal because simple object-level interactions contribute towards collective intelligence in the IOT network. It enables network accessibility and compatibility among the things. With this connectivity, new market opportunities for the Internet of Things can be created by the networking of smart things and applications.

1.1.1.3 Dynamic Nature


The primary activity of the Internet of Things is to collect data from its environment; this is achieved through the dynamic changes that take place around the devices. The state of these devices changes dynamically, for example sleeping and waking up, connected and/or disconnected, as well as the context of devices including temperature, location, and speed. In addition to the state of the device, the number of devices also changes dynamically with person, place, and time.

1.1.1.4 Enormous Scale


The number of devices that need to be managed and that communicate with each other will be much larger than the number of devices connected to the current Internet. The management of data generated from these devices, and its interpretation for application purposes, becomes more critical. Gartner (2015) confirmed the enormous scale of IOT in its estimate, stating that 5.5 million new things would get connected every day and that 6.4 billion connected things would be in use worldwide in 2016, up 30 percent from 2015. The report also forecast that the number of connected devices would reach 20.8 billion by 2020.
1.1.1.5 Sensing
IOT would not be possible without sensors that detect or measure changes in the environment and generate data that can report on their status or even interact with the environment. Sensing technologies provide the means to create capabilities that reflect a true awareness of the physical world and the people in it. The sensing information is simply the analog input from the physical world, but it can provide a rich understanding of our complex world.

1.1.1.6 Heterogeneity
Heterogeneity is one of the key characteristics of the Internet of Things. Devices in IOT are based on different hardware platforms and networks and can interact with other devices or service platforms through different networks. IOT architecture should support direct network connectivity between heterogeneous networks. The key design requirements for heterogeneous things and their environments in IOT are scalability, modularity, extensibility, and interoperability.

1.1.1.7 Security
IOT devices are naturally vulnerable to security threats. As we gain efficiencies, novel experiences, and other benefits from the IOT, it would be a mistake to forget about the security concerns associated with it. There are significant transparency and privacy issues with IOT. It is important to secure the endpoints, the networks, and the data that is transferred across all of them, which means creating a security paradigm.

1.1.2 Advantages of IOT

1.1.2.1 Communication
IOT encourages communication between devices, also famously known as machine-to-machine (M2M) communication. Because of this, physical devices are able to stay connected, and hence there is greater transparency, fewer inefficiencies, and greater quality.

1.1.2.2 Automation and Control


Because physical objects are connected and controlled digitally and centrally over a wireless infrastructure, there is a large amount of automation and control in the workings. Without human intervention, the machines are able to communicate with each other, leading to faster and more timely output.

1.1.2.3 Information
It is obvious that having more information helps in making better decisions. Whether it is a mundane decision such as needing to know what to buy at the grocery store, or whether your company has enough widgets and supplies, knowledge is power and more knowledge is better.
1.1.2.4 Monitor
The second most obvious advantage of IOT is monitoring. Knowing the exact quantity of supplies or the air quality in your home can provide further information that could not previously have been collected easily. For instance, knowing that you are low on milk or printer ink could save you another trip to the store in the near future. Furthermore, monitoring the expiration of products can and will improve safety.

1.1.2.5 Time
As hinted in the previous examples, the amount of time saved because of IOT could be quite large. And in today’s modern
life, we all could use more time.

1.1.2.6 Money
The biggest advantage of IOT is saving money. If the price of the tagging and monitoring equipment is less than the amount of money saved, then the Internet of Things will be very widely adopted. IOT fundamentally proves to be very helpful to people in their daily routines by making appliances communicate with each other in an effective manner, thereby saving and conserving energy and cost. By allowing data to be communicated and shared between devices and then translated into the required form, it makes our systems efficient.

1.1.2.7 Automation of daily tasks leads to better monitoring of devices

The IOT allows you to automate and control the tasks that are done on a daily basis, avoiding human intervention. Machine-
to-machine communication helps to maintain transparency in the processes. It also leads to uniformity in the tasks. It can
also maintain the quality of service. We can also take necessary action in case of emergencies.

1.1.2.8 Efficient and Saves Time


Machine-to-machine interaction provides better efficiency; hence, accurate results can be obtained quickly. This results in saving valuable time. Instead of repeating the same tasks every day, it enables people to do other creative jobs.

1.1.2.9 Saves Money


Optimum utilization of energy and resources can be achieved by adopting this technology and keeping the devices under
surveillance. We can be alerted in case of possible bottlenecks, breakdowns, and damages to the system. Hence, we can
save money by using this technology.

1.1.2.10 Better Quality of Life


All the applications of this technology culminate in increased comfort, convenience, and better management, thereby
improving the quality of life.
1.1.3 Disadvantages of IOT
1.1.3.1 Compatibility
Currently, there is no international standard of compatibility for tagging and monitoring equipment. This disadvantage is arguably the easiest to overcome: the manufacturers of this equipment simply need to agree on a standard, such as Bluetooth or USB. Nothing new or innovative is needed.

1.1.3.2 Complexity
As with all complex systems, there are more opportunities for failure. With the Internet of Things, failures could skyrocket.
For instance, let’s say that both you and your spouse each get a message saying that your milk has expired, and both of you
stop at a store on your way home, and you both purchase milk. As a result, you and your spouse have purchased twice the
amount that you both need. Or maybe a bug in the software ends up automatically ordering a new ink cartridge for your
printer each and every hour for a few days, or at least after each power failure, when you only need a single replacement.

1.1.3.3 Privacy/Security
With all of this IOT data being transmitted, the risk of losing privacy increases. For instance, how well will the data be encrypted when it is stored and transmitted? Do you want your neighbours or employers to know what medications you are taking or what your financial situation is?

1.1.3.4 Safety
Imagine if a notorious hacker changes your prescription. Or if a store automatically ships you an equivalent product that
you are allergic to, or a flavour that you do not like, or a product that is already expired. As a result, safety is ultimately in
the hands of the consumer, who must verify any and all automation. As household appliances, industrial machinery, public-sector services like water supply and transport, and many other devices are all connected to the Internet, a lot of information is available on them. This information is prone to attack by hackers, and it would be very disastrous if private and confidential information were accessed by unauthorized intruders.

1.1.3.5 Lesser Employment of Menial Staff


Unskilled workers and helpers may end up losing their jobs as an effect of the automation of daily activities; with routine activities automated, there will naturally be less demand for human resources, primarily workers and less-educated staff. This can lead to unemployment issues in society. This is a problem with the advent of any technology and can be overcome with education.

1.1.3.6 Technology Takes Control of Life


Our lives will be increasingly controlled by technology and dependent on it. The younger generation is already addicted to technology for every little thing. We have to decide how much of our daily lives we are willing to mechanize and have controlled by technology.
1.1.4 Application Grounds of IOT

1.1.4.1 Wearables
Wearable technology is a hallmark of IOT applications, and it is one of the earliest areas to have deployed IOT in its services. Fitbits, heart rate monitors, smartwatches, and glucose monitoring devices reflect successful applications of IOT.

1.1.4.2 Smart homes


This area of application is the one most relevant to this particular project, so it is discussed in more detail further on. Jarvis, an AI home automation system built by Mark Zuckerberg, is a remarkable example in this field of application.

1.1.4.3 Health care


IOT applications have turned the reactive, medicine-based system into a proactive, wellness-based system. IOT focuses on creating systems rather than equipment. IOT creates a future of medicine and healthcare which exploits a highly integrated network of sophisticated medical devices. The integration of all elements provides more accuracy, more attention to detail, faster reactions to events, and constant improvement, while reducing the typical overhead of medical research and organizations.

1.1.4.4 Agriculture
A greenhouse farming technique enhances the yield of crops by controlling environmental parameters. However, manual handling results in production loss, energy loss, and labour cost, making the process less effective. A greenhouse with embedded devices not only makes it easier to monitor but also enables us to control the climate inside it. Sensors measure different parameters according to the plant requirement and send them to the cloud, which then processes the data and applies a control action.
1.1.4.5 Industrial Automation
For a higher return on investment, this field requires both fast development and quality products, and this need is what coined the term IIOT. The whole schematic is being re-engineered by IOT applications. The following are the domains of IOT applications in industrial automation:
• Factory Digitalization
• Product flow Monitoring
• Inventory Management
• Safety and Security
• Quality Control
• Packaging optimization
• Logistics and Supply Chain Optimization

1.1.4.6 Government and Safety


IOT applied to government and safety allows improved law enforcement, defence, city planning, and economic
management. The technology fills in the current gaps, corrects many current flaws, and expands the reach of these efforts.
For example, IOT can help city planners have a clearer view of the impact of their design, and governments have a better
idea of the local economy.

1.1.5 IOT Technologies and Protocols


Several communication protocols and technologies cater to and meet the specific functional requirements of an IOT system.

1.1.5.1 Bluetooth
Bluetooth is a short-range IOT communication protocol/technology that is prevalent in many consumer product markets and in computing. It is expected to be key for wearable products in particular, again connecting to the IOT, albeit probably via a smartphone in many cases. The new Bluetooth Low Energy (BLE) – or Bluetooth Smart, as it is now branded – is a significant protocol for IOT applications. Importantly, while it offers a similar range to classic Bluetooth, it has been designed to offer significantly reduced power consumption.

1.1.5.2 Zigbee
ZigBee is similar to Bluetooth and is used mainly in industrial settings. It has some significant advantages in complex systems, offering low-power operation, high security, robustness, and high scalability, and it is well positioned to take advantage of wireless control and sensor networks in IOT applications. The latest version of ZigBee is the recently launched 3.0, which is essentially the unification of the various ZigBee wireless standards into a single standard.

1.1.5.3 Z-Wave
Z-Wave is a low-power RF communications IOT technology primarily designed for home automation, for products such as lamp controllers and sensors among many other devices. Z-Wave uses a simpler protocol than some others, which can enable faster and simpler development, but the only maker of chips is Sigma Designs, compared with multiple sources for other wireless technologies such as ZigBee.

1.1.5.4 Wi-Fi
Wi-Fi connectivity is one of the most popular IOT communication protocols and often an obvious choice for many developers, especially given the availability of Wi-Fi within the home environment and within LANs. There is a wide existing infrastructure, as well as fast data transfer and the ability to handle high quantities of data. Currently, the most common Wi-Fi standard used in homes and many businesses is 802.11n, which offers throughput in the range of hundreds of megabits per second; this is fine for file transfers but may be too power-consuming for many IOT applications.

1.1.5.5 Cellular
Any IOT application that requires operation over longer distances can take advantage of GSM/3G/4G cellular communication capabilities. While cellular is clearly capable of sending high quantities of data, especially on 4G, the cost and power consumption will be too high for many applications. It can, however, be suitable for sensor-based, low-bandwidth projects that send very small amounts of data over the Internet.

1.1.5.6 NFC
NFC (Near Field Communication) is an IOT technology. It enables simple and safe communications between electronic
devices, and specifically for smartphones, allowing consumers to perform transactions in which one does not have to be
physically present. It helps the user to access digital content and connect electronic devices. Essentially it extends the
capability of contactless card technology and enables devices to share information at a distance that is less than 4cm.
1.1.5.7 LoRaWAN
LoRaWAN is a popular IOT technology that targets wide-area network (WAN) applications. LoRaWAN is designed to provide low-power WANs with features specifically needed to support low-cost, mobile, secure communication in IOT, smart city, and industrial applications. It specifically meets requirements for low power consumption and supports large networks with millions of devices, with data rates ranging from 0.3 kbps to 50 kbps.

1.1.6 IOT software


IOT software addresses its key areas of networking and action through platforms, embedded systems, partner systems, and
middleware. These individual and master applications are responsible for data collection, device integration, real-time
analytics, and application and process extension within the IOT network. They exploit integration with critical business
systems (e.g., ordering systems, robotics, scheduling, and more) in the execution of related tasks.

1.1.6.1 Data Collection


This software manages sensing, measurements, light data filtering, light data security, and aggregation of data. It uses
certain protocols to aid sensors in connecting with real-time, machine to-machine networks. Then it collects data from
multiple devices and distributes it in accordance with settings. It also works in reverse by distributing data over devices.
The system eventually transmits all collected data to a central server.
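
As a rough illustration of this data-collection role (a minimal sketch only: the read_sensor helper, the sensor names, and the server URL below are hypothetical and not part of any particular IOT platform), such software might poll a few devices, aggregate their readings, and forward them to a central server:

import json
import random
import time
from urllib.request import Request, urlopen

def read_sensor(sensor_id):
    # Hypothetical stand-in for reading a physical sensor over GPIO or a fieldbus.
    return {"sensor": sensor_id,
            "value": round(random.uniform(20.0, 30.0), 2),
            "timestamp": time.time()}

def collect_and_forward(sensor_ids, server_url):
    # Collect one reading per device and aggregate them into a single payload.
    readings = [read_sensor(s) for s in sensor_ids]
    payload = json.dumps({"readings": readings}).encode("utf-8")
    # Transmit the aggregated data to the central server (URL is illustrative).
    request = Request(server_url, data=payload,
                      headers={"Content-Type": "application/json"})
    with urlopen(request) as response:
        return response.status

if __name__ == "__main__":
    print(collect_and_forward(["temp-1", "temp-2", "humidity-1"],
                              "http://example.com/iot/ingest"))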

1.1.6.2 Device Integration


Software supporting integration binds (dependent relationships) all system devices to create the body of the IOT system. It
ensures the necessary cooperation and stable networking between devices. These applications are the defining software
technology of the IOT network because without them, it is not an IOT system. They manage the various applications,
protocols, and limitations of each device to allow communication.

1.1.6.3 Real-Time Analytics


These applications take data or input from various devices and convert it into feasible actions or clear patterns for human
analysis. They analyse information based on various settings and designs in order to perform automation-related tasks or
provide the data required by industry.

1.1.6.4 Application and Process Extension


These applications extend the reach of existing systems and software to allow a wider, more effective system. They integrate
predefined devices for specific purposes such as allowing certain mobile devices or engineering instruments access.
CHAPTER 2

OJT-1 PROJECT

“MOBILE CONTROLLED ESP32 TWO-MOTOR ROBOT”

2.0 INTRODUCTION
• This robot is a simple yet powerful device that demonstrates the capabilities
of the ESP32 microcontroller. It’s designed with two motors, allowing it to
move in various directions - forward, backward, left, and right.
• The heart of this robot is the ESP32 microcontroller, a highly integrated chip
with both Wi-Fi and Bluetooth capabilities. It’s paired with an L298N Motor
Driver, which handles the low-level details of controlling the speed and
direction of the motors.
• The robot is powered by a 3-12V power supply. The motors are connected
to the motor driver, which in turn is connected to the ESP32. The ESP32
receives commands and controls the motors accordingly.
• To build this robot, you’ll need a two-motor robot chassis, an ESP32
microcontroller, an L298N motor driver, connecting wires, and a power
supply. You’ll also need to install the Arduino IDE for programming the
ESP32.
• This two-motor robot is a great project for anyone interested in robotics,
microcontrollers, or wireless communication. It’s a stepping stone to
creating more complex robots based on the ESP32 and the L298N motor
driver.

2.1.1 Interface L298N DC Motor Driver Module with ESP32

• In this section, we will learn to interface the L298N motor driver with the ESP32. This is an in-depth guide to the L298N motor driver, including its specifications, pinout, and interfacing with the ESP32 board. First, we will see an example to control DC motor speed; then we will see an example to control the direction of a DC motor using the L298N motor driver.
2.1.2 Components Used

• ESP32
• L298N Motor Driver
• Dual Shaft BO Motor
• BO Motor Mount
• BO Wheel
• Caster Wheel
• 3.7V 2600mAh Lithium-Ion Battery
• Acrylic Base Plate
• Standoffs
• Jumper Cable
• Nuts and Bolts
• Smartphone or tablet: Used to run the MIT App Inventor app for controlling the robot.

2.1.3 L298N Motor Driver Module

• The L298N motor driver module is very easy to use with microcontrollers and
relatively inexpensive as well.
• It is widely used in controlling robots as we can connect up to four motors at
once but if we want to control the speed and direction as well then it allows two
motors to be connected.
• Thus, it is perfect for two-wheeled robots. This module is mainly used in robotics and in controlling DC and stepper motors.
Figure:2.1 L298N Motor Driver Module

2.1.4 L298N Module Specifications

• The figure below shows the pinout of the module.

Figure:2.2 L298N Module Specification

2.1.5 Controlling DC motors through L298N Driver Module


• Let us now see the details behind controlling the dc motor through
the L298N module.

2.1.6 Control Pins

• There are two types of control pins found at the bottom right side of the module.
One type controls the speed and the other type controls the direction of the
motor.
2.1.7 Speed Control (ENABLE) Pins

• The speed control pins, labelled ENA and ENB on the module, control the speed of the DC motor and turn it ON and OFF.

Figure:2.3 Speed Control (ENABLE)Pins

2.1.8 Interface L298N DC Motor Driver with ESP32

Figure:2.4 Circuit Diagram of Mobile Controlled ESP32 Two-Wheel Drive Robot


• To control the DC motors through the motor driver, we demonstrated how to control two DC motors using this driver.
• We used the motor A output pins to control one motor. Thus, ENA sets the speed, and IN1 and IN2 set the spinning direction of the motor. On the ESP32, PWM is supported on the output-capable GPIO pins, so we chose a GPIO pin to connect to the enable pin of the L298N motor driver.
• We used GPIO13 as the PWM pin. In the above schematic, GPIO13 is connected to ENA, and IN1 and IN2 are connected to GPIO5 and GPIO4 respectively.
• We chose appropriate GPIO pins when connecting the ESP32 board and the driver module together.
• The DC motor is rated at 6-12V and requires a large amount of current to start, which is why we use an external power source for the motors. As any supply in the 6-12V range can be used, we incorporated a 9V battery in our case; any other suitable power source will also work.
• We keep the 5V enable jumper in its place, as it powers up the L298N motor driver.

2.1.9 MIT App Inventor for App Development


MIT App Inventor is a visual programming environment that allows for the development of Android apps without the need for traditional coding. It provides a simple drag-and-drop interface for designing app layouts and programming app behaviour. In this project, MIT App Inventor was used to create a user-friendly app for controlling the robot.
2.2.0 MIT App Inventor

Figure:2.5 MIT App Inventor for App Development

2.2.1 Arduino Sketch

#include <BluetoothSerial.h>
#include <esp_now.h>
#include <WiFi.h>

#if !defined(CONFIG_BT_ENABLED) || !defined(CONFIG_BLUEDROID_ENABLED)
#error Bluetooth is off please turn on
#endif

BluetoothSerial SerialBT;

char x;                  // command character received over Bluetooth
int motor1Pin1 = 13;     // Motor 1, forward
int motor1Pin2 = 12;     // Motor 1, backward
int motor2Pin1 = 14;     // Motor 2, forward
int motor2Pin2 = 27;     // Motor 2, backward

void setup() {
  Serial.begin(115200);
  SerialBT.begin("ESP32test"); // Bluetooth device name
  Serial.println("robot control is active, now you can pair with Bluetooth");
  pinMode(motor1Pin1, OUTPUT);
  pinMode(motor1Pin2, OUTPUT);
  pinMode(motor2Pin1, OUTPUT);
  pinMode(motor2Pin2, OUTPUT);
}

void loop() {
  if (SerialBT.available()) {
    x = SerialBT.read();
    Serial.println(x);

    if (x == 'F') {            // forward: both motors forward
      digitalWrite(motor1Pin1, HIGH);
      digitalWrite(motor1Pin2, LOW);
      digitalWrite(motor2Pin1, HIGH);
      digitalWrite(motor2Pin2, LOW);
      Serial.println("forward");
    }
    if (x == 'B') {            // backward: both motors backward
      digitalWrite(motor1Pin1, LOW);
      digitalWrite(motor1Pin2, HIGH);
      digitalWrite(motor2Pin1, LOW);
      digitalWrite(motor2Pin2, HIGH);
      Serial.println("backward");
    }
    if (x == 'R') {            // right: motors spin in opposite directions
      digitalWrite(motor1Pin1, HIGH);
      digitalWrite(motor1Pin2, LOW);
      digitalWrite(motor2Pin1, LOW);
      digitalWrite(motor2Pin2, HIGH);
      Serial.println("right");
    }
    if (x == 'L') {            // left: motors spin in opposite directions
      digitalWrite(motor1Pin1, LOW);
      digitalWrite(motor1Pin2, HIGH);
      digitalWrite(motor2Pin1, HIGH);
      digitalWrite(motor2Pin2, LOW);
      Serial.println("left");
    }
    if (x == 'S') {            // stop: all motor pins low
      digitalWrite(motor1Pin1, LOW);
      digitalWrite(motor1Pin2, LOW);
      digitalWrite(motor2Pin1, LOW);
      digitalWrite(motor2Pin2, LOW);
      Serial.println("STOP");
    }
  }
}

2.2.2 App Development Programming

Using MIT App Inventor, a custom Android app was developed to provide a user interface for controlling the Mobile Controlled ESP32 Two-Wheel Drive Robot. The app allows the user to pair over Bluetooth and send movement commands (forward, backward, left, right, and stop) to the robot. This methodology provided a systematic approach to designing, implementing, and testing the robot control system, ensuring its functionality and usability for end users.
2.2.3 App development programming (Blocks)

Figure:2.6 App development programming (Blocks) of Mobile Controlled ESP32


Two- Wheel Drive Robot.

2.3 Demonstration

Figure:2.7 Snip of User Interface of Mobile Controlled ESP32 Two- Wheel Drive
Robot
Figure:2.8 User Interface of Mobile Controlled ESP32 Two-
Wheel Drive Robot

Figure:2.9 Mobile Controlled ESP32 Two-Wheel Drive Robot wiring


Figure 2.10 Mobile Controlled ESP32 Two-Wheel Drive Robot working ON

2.4 Conclusion
• By following these steps, you should have a functional system where the
ESP32 controls two motors based on commands received from the MIT
App Inventor mobile app.
Chapter 3

OJT-2

"SMART HOME AUTOMATION SYSTEM WITH


FACE DETECTION USING ARDUINO UNO AND
OPENCV”

3.1. INTRODUCTION
In today's rapidly advancing technological landscape, the concept of a smart home
has become increasingly prevalent, offering homeowners enhanced convenience,
security, and energy efficiency. This project endeavors to bring this futuristic vision
to life by integrating cutting-edge technologies such as Arduino Uno, OpenCV for
AI face detection, and Pyfirmata for seamless communication between Python code
and the Arduino Uno board. The system's core functionality revolves around its
ability to intelligently detect human faces using the OpenCV library, specifically
the haarcascade_frontalface_default XML file, and then trigger actions through the
Arduino Uno, such as controlling AC loads using a 2-channel relay.

The Arduino Uno serves as the central hub of the system, interfacing with various
hardware components and executing commands based on the input received from
the face detection algorithm. This allows for a dynamic and responsive environment
where the automation system can adapt to the presence or absence of individuals in
the vicinity. Furthermore, the system is designed to be easily accessible, requiring
only a USB connection to a laptop or PC, making it suitable for a wide range of
home environments.

By leveraging the power of AI and IoT technologies, this project not only
showcases the potential for enhanced automation and convenience in modern
homes but also serves as a practical demonstration of how these technologies can
be integrated to create intelligent, interactive systems.
Through this project, we aim to inspire further innovation in the field of smart
home automation and contribute to the ongoing evolution of the connected home
ecosystem.

3.2. OBJECTIVES OF THE PROJECT


• Integrate Arduino Uno with OpenCV for AI face detection.
• Utilize Pyfirmata for running Python code on Arduino Uno.
• Control AC loads using a 2-channel relay.
• Establish USB connectivity between Arduino Uno and a laptop/PC.
• Create a smart home automation system that responds to detected faces.

3.3. Project Overview:


3.3.1. Components Used:
• Arduino Uno
• 2-channel Relay
• Laptop/PC with Arduino IDE and PyCharm Community Edition 2023.2.5
• OpenCV Library
• Pyfirmata Library
• Haarcascade_frontalface_default XML file
• USB Cable, Jumper wire

3.3.1.1 Arduino Uno:


Arduino Uno, a popular microcontroller board, serves as the cornerstone of
numerous electronics projects due to its versatility, ease of use, and robust
community support. Developed by Arduino LLC, the Uno model is widely
acclaimed for its accessibility to beginners while offering advanced capabilities
for seasoned enthusiasts and professionals alike.

Arduino Uno simplifies the process of prototyping and creating electronic devices
by providing a user-friendly platform for programming and interfacing with a
variety of sensors, actuators, and other peripherals. It’s mostly used for
educational and commercial projects.

Specifications:
• Microcontroller: Atmel ATmega328P
• Operating Voltage: 5V
• Input Voltage (recommended): 7-12V
• Input Voltage (limit): 6-20V
• Digital I/O Pins: 14 (of which 6 provide PWM output)
• Analog Input Pins: 6
• DC Current per I/O Pin: 20 mA
• DC Current for 3.3V Pin: 50 mA
• Flash Memory: 32 KB (0.5 KB used by bootloader)
• SRAM: 2 KB
• EEPROM: 1 KB
• Clock Speed: 16 MHz

Figure 3.1 Arduino Pin out diagram


3.3.1.2 2-channel Relay

A 2-channel relay module is an electromechanical switching board that can control two separate electrical circuits, each with its own low-current control signal. Each relay channel is capable of switching a separate electrical circuit on or off, making it a versatile component in various applications, including home automation, industrial control systems, and robotics.

Specifications:

• Number of Channels: 2
• Operating Voltage: Typically, 5V or 12V
• Maximum Switching Voltage: Varies depending on the relay model,
commonly up to 250V AC or 30V DC per channel
• Maximum Switching Current: Usually up to 10A per channel
• Control Signal: Typically requires a low current (e.g., 20-30mA) control
signal to switch the relay
• Type: Normally Open (NO) and Normally Closed (NC) contacts
• Connection: Commonly uses screw terminals or pin headers for easy wiring
• Applications: Used in home automation for controlling lights, fans, and
other appliances; in industrial control systems for controlling machinery;
and in robotics for switching power to motors or other devices.

Figure 3.2 2-Channel relay Pin out diagram


3.3.1.3 Arduino IDE
The Arduino Integrated Development Environment (IDE) is an open-source
software application that allows you to write, compile, and upload code to Arduino
compatible boards. It provides a user-friendly interface and a set of tools that
simplify the process of developing and uploading code for Arduino projects.

Key Features of the Arduino IDE:


• An open-source software used to write and upload code to Arduino compatible boards.
• Provides a user-friendly interface for programming Arduino projects.
• Works on Windows, macOS, and Linux operating systems.
• Supports various Arduino boards such as Uno, Mega, Nano, etc.
• Uses a simplified version of C++ programming language.
• Easy to learn and suitable for beginners.
• Source code is freely available for modification and improvement.
• Encourages innovation and customization.
• Compatible with version control systems like Git for managing project versions.
• Facilitates collaboration in team projects.
• Syntax highlighting
• Serial monitor
• Built-in libraries

Figure 3.3 Arduino IDE Logo


3.3.1.4 PyCharm Community Edition 2023.2.5v:
PyCharm Community Edition is a free, open-source integrated development
environment (IDE) specifically designed for Python programming. Developed by
JetBrains, PyCharm Community Edition provides a range of features and tools to
help Python developers write, debug, and test their code more efficiently.

Key Features of PyCharm Community Edition:

• PyCharm supports code analysis, a graphical debugger, an integrated unit tester, and integration with version control systems (VCSs); web development with Django and data science with Anaconda are offered in the Professional edition.
• PyCharm is compatible with Windows, Linux, and macOS platforms and
supports both Python 2 (2.7) and Python 3 (3.5 and above).
• The Community Edition is open-source software available for free download.
• PyCharm is widely used for Python programming due to its intelligent code
editor, which improves the readability of code using various colour
schemes and error highlighting.
• Code Editing
• Debugging
• Testing
• Cross-Platform Support

• Version Control Integration

• Database Tools

• Extensibility

Figure 3.4 PyCharm Logo

3.3.1.5 OpenCV Library:

OpenCV (Open Source Computer Vision Library) is a powerful open-source computer vision and machine learning software library. It provides a wide range of functions for image and video processing, including object detection, facial recognition, and augmented reality. OpenCV is written in C++ and has bindings for Python, making it accessible and easy to use for developers working on projects that require computer vision capabilities.

To use OpenCV in PyCharm, you first need to install the library. You can do this using pip, the Python package manager, by running the following command in your terminal or command prompt:

pip install opencv-python

Once OpenCV is installed, you can start using it in your PyCharm project.

Figure 3.5 OpenCV Logo


The short sketch below is a simple example; OpenCV offers a wide range of functions for more complex image and video processing tasks. You can find detailed documentation and examples on the OpenCV website to help you get started with using the library in your PyCharm projects.
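
As a minimal, illustrative sketch (assuming a webcam is available at index 0, as in the project code), the following grabs one frame, converts it to grayscale, and displays it:

import cv2

# Open the default webcam (index 0), read a single frame,
# convert it to grayscale, and show it until a key is pressed.
capture = cv2.VideoCapture(0)
ok, frame = capture.read()
if ok:
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    cv2.imshow("Webcam frame (grayscale)", gray)
    cv2.waitKey(0)
capture.release()
cv2.destroyAllWindows()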

3.3.1.6 Pyfirmata Library:

Pyfirmata is a Python library that enables communication between a Python script running on a computer and an Arduino board. It allows you to control the Arduino's digital and analog pins from your Python code, making it easier to create interactive projects and prototypes. Pyfirmata works by uploading a special firmware to the Arduino board that allows it to communicate over the serial port with the Python script.

• How to Use Pyfirmata: Using Pyfirmata is relatively simple. First, you need to install the library using pip:

pip install pyfirmata

• Next, you need to upload the StandardFirmata sketch to your Arduino board.
• This sketch can be found in the Arduino IDE under File > Examples >
Firmata > StandardFirmata.
• Upload this sketch to your Arduino board using the Arduino IDE.
• Once the sketch is uploaded, you can use Pyfirmata in your Python code to
communicate with the Arduino board.

Here's why Pyfirmata is used:

Pyfirmata is used to control Arduino boards from Python scripts. It provides a convenient way to interact with Arduino hardware without having to write complex C/C++ code. This makes it ideal for rapid prototyping and for developing interactive projects where real-time control of hardware is required.
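
As a minimal sketch of this workflow (the port name "COM3" and pin 13 are placeholders; use the port shown in the Arduino IDE and the pin wired in your setup), a script of this kind can toggle a digital pin through the StandardFirmata firmware:

import time
import pyfirmata

# Connect to an Arduino running the StandardFirmata sketch.
# "COM3" is a placeholder; on Linux the port may look like "/dev/ttyACM0".
board = pyfirmata.Arduino("COM3")

# Blink digital pin 13 five times (pin 13 is a placeholder output).
for _ in range(5):
    board.digital[13].write(1)  # drive the pin HIGH
    time.sleep(0.5)
    board.digital[13].write(0)  # drive the pin LOW
    time.sleep(0.5)

board.exit()  # close the serial connection cleanly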

3.3.1.7 Haarcascade_frontalface_default XML File:
The `haarcascade_frontalface_default.xml` file is a pre-trained XML file used with
OpenCV for face detection. You can download this file from the OpenCV GitHub
repository or from the OpenCV source files.
Here's how you can use it in your Python code with OpenCV:

• Download the XML File:


• You can download the `haarcascade_frontalface_default.xml` file from the
OpenCV GitHub repository or from the OpenCV source files. For example,
you can download it from the GitHub repository using the following link:
[haarcascade_frontalface_default.xml]
• Link: https://github.com/opencv/opencv/blob/master/data/haarcascades/haarcascade_frontalface_default.xml
• Place the XML File in Your Project Directory:
• Once you have downloaded the `haarcascade_frontalface_default.xml` file,
place it in the same directory as your Python script or in a directory that is
accessible to your Python script.

The "haarcascade_frontalface_default.xml" file is loaded using "cv2.CascadeClassifier('haarcascade_frontalface_default.xml')". This file contains the trained data for detecting frontal faces. We can experiment with different parameters of the "detectMultiScale" function to adjust the sensitivity and accuracy of the face detection algorithm, as illustrated in the sketch below.
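
A minimal sketch of this step, assuming an example image file named "photo.jpg" in the working directory (the file name is a placeholder; here the cascade is loaded from OpenCV's bundled data directory rather than the project folder):

import cv2

# Load the pre-trained frontal-face cascade shipped with OpenCV.
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

image = cv2.imread("photo.jpg")          # placeholder image path
gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)

# scaleFactor, minNeighbors and minSize are the tuning knobs mentioned above:
# lowering them makes detection more sensitive but also more error-prone.
faces = cascade.detectMultiScale(gray, scaleFactor=1.3, minNeighbors=5,
                                 minSize=(30, 30))

for (x, y, w, h) in faces:
    cv2.rectangle(image, (x, y), (x + w, y + h), (0, 255, 255), 2)

cv2.imshow("Detected faces", image)
cv2.waitKey(0)
cv2.destroyAllWindows()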

3.3.1.8 Circuit Diagram:

A circuit diagram was created to illustrate the connections between the 2-channel
relay module, microcontroller, and other components used in the project. The
diagram provides a visual representation of how the system is wired and helps in
understanding the overall setup of the home automation system.

Figure 3.6 Circuit Diagram of Home Automation _AI with all the Interfacing Module’s

3.3.1.9 Programming:

Python Code:

import pyfirmata
import cv2

video = cv2.VideoCapture(0)  # for an external webcam, change 0 to 1
face_cascade = cv2.CascadeClassifier(cv2.data.haarcascades +
                                     'haarcascade_frontalface_default.xml')
pin = 11   # relay channel 1
pin1 = 12  # relay channel 2
port = input("enter port")
com = "COM" + port
print(com)
board = pyfirmata.Arduino(com)

while True:
    ok, videoread = video.read()
    if not ok:
        break
    gray = cv2.cvtColor(videoread, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, 1.3, 5, minSize=(30, 30))
    if len(faces) > 0:
        # Mark each detected face and switch the relay channels.
        for (x, y, w, h) in faces:
            c = (x + w // 2, y + h // 2)
            cv2.circle(videoread, c, int(w / 2), (0, 255, 255), 5)
        label = "Face detected"
        board.digital[pin].write(0)
        board.digital[pin1].write(1)
    else:
        label = "Face not detected"
        board.digital[pin].write(1)
        board.digital[pin1].write(0)
    cv2.putText(videoread, label, (10, 30), cv2.FONT_HERSHEY_SIMPLEX, 1,
                (255, 0, 0), 4)
    cv2.imshow("Home_Automation_with_AI", videoread)
    if cv2.waitKey(1) & 0xFF == ord('e'):  # press "e" once to end the program
        break

board.digital[pin].write(0)
board.digital[pin1].write(0)
cv2.destroyAllWindows()

3.4. DEMONSTRATION

Figure 3.7 Home Automation _AI Face Not Detected

Figure 3.8 Home Automation _AI Face Detected

3.5. CONCLUSION
The integration of Arduino, OpenCV, and Pyfirmata in this project has
successfully demonstrated the potential for creating a smart home automation
system with AI face detection. By combining these technologies, we have created
a system that can detect faces in real-time and trigger actions such as controlling
AC loads. This project serves as a proof of concept for utilizing these technologies
in smart home applications.

The use of Arduino as the central controller provides a flexible and scalable
platform for expanding the system's functionality. OpenCV's face detection
algorithm has proven to be robust and reliable, allowing for accurate detection of
faces in various conditions. Pyfirmata has simplified the communication between
Python and Arduino, enabling seamless integration of the two technologies.

3.5.1. Future Enhancements:

One potential future enhancement for this system is the implementation of sleep
detection for drivers. By utilizing additional sensors, such as eye-tracking or heart
rate monitors, the system could detect signs of drowsiness in drivers and trigger
alerts to prevent accidents. Additionally, the system could be integrated with a
braking system to automatically apply brakes in case of detected drowsiness,
further enhancing the safety features of the system.

Overall, this project lays the foundation for further research and development in
the field of smart home technology and AI-based automation systems. With
continued advancements in technology, we can expect to see more innovative
solutions for improving the efficiency, comfort, and safety of our homes and daily
lives.

Chapter 4

USE CASE-1 AND USE CASE-2

4.1. Use case-1 Flow Diagram: (User)

Figure 4.1 Use case-1 flow diagram

4.1.1. Explanation of the flow diagram:

The flow diagram provided outlines how a user operates the Two_Motor_Robot app created with MIT App Inventor. Here is the explanation in points:

• Installation: The user begins by installing the Two_Motor_Robot
application on their device.
• Pair with Bluetooth: Next, the user pairs their device with the
Two_Motor_Robot system via Bluetooth.
• Connect with Bluetooth: Once paired, the user connects to the
Two_Motor_Robot system using Bluetooth.
• Control Robot: The user can then control the robot's movements:
o FORWARD: the robot moves forward.
o BACKWARD: the robot moves backward.
o RIGHT: the robot turns right.
o LEFT: the robot turns left.
o STOP: the robot stops.

This flowchart represents a user-friendly interface that allows for easy control of the Two_Motor_Robot through a mobile application. It simplifies the process of managing the robot remotely.
4.2. Use case-2 Flow Diagram: (User)

Figure 4.2 Use case-2 flow diagram


4.2.1. Explanation of the flow diagram:

The flow diagram provided outlines how a user operates the home automation project using Arduino and AI face detection. Here is the explanation in points:

• Open PyCharm: The user initiates the process by opening PyCharm Community Edition 2023.2.5.
• Run the Code: They execute the Python code that interfaces with the
Arduino Uno board via Pyfirmata and incorporates AI face detection using
OpenCV and the haarcascade_frontalface_default XML file.
• Enter COM Port: The user enters the COM port number that corresponds
to the Arduino Uno connected to the laptop/PC via a USB cable.
• Face Detection: Depending on whether a face is detected or not, the
system will control the lights:
▪ Face Detected: If a face is detected, Light 1 is turned ON.
▪ No Face Detected: If no face is detected, Light 2 is turned ON.

This flowchart outlines the user interaction with the software and the subsequent
automation response based on face detection results. The 2-channel relay module
is used to control AC loads, such as lights, based on the detection outcome.
REFERENCES

• MIT App Inventor. (n.d.). Retrieved from http://appinventor.mit.edu/


• Bluetooth HC-05 Module. (n.d.). Retrieved from
https://components101.com/wireless/hc-05-bluetooth-module
• 4-Channel Relay Module. (n.d.). Retrieved from
https://components101.com/relays/4-channel-relay-module
• Arduino Uno. (n.d.). Retrieved from
https://www.arduino.cc/en/Main/ArduinoBoardUno
• RoboManthan YouTube, Home Automation Using Bluetooth App: https://www.youtube.com/watch?v=z4UF630u87o
• OpenCV. (n.d.). OpenCV: Open-Source Computer Vision Library.
Retrieved from https://opencv.org/
• RoboManthan YouTube, PyCharm Installation Tutorial: https://www.youtube.com/watch?v=CawAHN_-Rw8
• PyCharm Community Edition. (n.d.). JetBrains. Retrieved from
https://www.jetbrains.com/pycharm/
• Arduino. (n.d.). Arduino - Home. Retrieved from https://www.arduino.cc/
• Pyfirmata. (n.d.). PyPI. Retrieved from https://pypi.org/project/pyFirmata/
• "Haarcascade_frontalface_default.xml." GitHub. Retrieved from
https://github.com/opencv/opencv/blob/master/data/haarcascades/haarcasc
ade_frontalface_default.xml
APPENDICES

“Using a Mobile Controlled ESP32 Two-Wheel Drive Robot”

Appendix A: Project Overview

Appendices for a mobile-controlled ESP32 two-wheel drive robot project might include technical details, circuit diagrams, code snippets, and additional resources. Here's a brief outline:

1. Technical Specifications:
• Description of the ESP32 microcontroller.
• Specifications of the two-wheel drive mechanism (motors, wheels, chassis).
• Power requirements (battery voltage, current).

2. Circuit Diagram:
• Schematics detailing the connections between the ESP32, motor driver, motors, and other components.
• Pinout diagrams for the ESP32 board.

3. Code Snippets:
• Key sections of the code for controlling the robot, including initialization, motor control, and communication with the mobile app.
• Explanation of how the code works and any important algorithms or functions.

4. Mobile App Interface:
• Screenshots or sketches of the mobile app interface used to control the robot.
• Description of the controls and their functionalities.

5. Bill of Materials (BOM):
• List of all components used in the project, including part numbers and quantities.
• Cost estimates for each component.

6. Assembly Instructions:
• Step-by-step guide on assembling the hardware
components.
• Tips for troubleshooting common assembly
issues.
7. Testing and Calibration:
• Procedures for testing the robot's functionality.
• Instructions for calibrating motor speeds and
adjusting control parameters.
8. Additional Resources:
• Links to datasheets for components used.
• References to relevant tutorials, articles, or
forums for further learning.
• Any libraries or dependencies used in the project.
9. Safety Precautions:
• Recommendations for safe operation of the robot,
including handling of batteries and motors.
• Guidelines for using the mobile app in a
responsible manner.
10. Acknowledgements:
• Recognition of individuals or organizations
that contributed to the project, such as
mentors, sponsors, or collaborators.
Appendix B: Data Tables of Equipment Specifications

Equipment: Description
ESP32 Development Board: Microcontroller board with built-in Wi-Fi and Bluetooth capabilities.
Motor Driver: H-bridge motor driver for controlling the two-wheel drive mechanism.
DC Motors: Geared DC motors used for driving the wheels.
Wheels: Rubber wheels for traction and smooth movement.
Battery Pack: Lithium-ion battery pack for powering the robot.
Jumper Wires: Wires for making electrical connections on the breadboard or PCB.
Breadboard: Prototyping board for temporarily connecting electronic components.
Power Switch: On/off switch for controlling power to the robot.
USB Cable: USB cable for programming and powering the ESP32 board.
Smartphone/Tablet: Device running the mobile app for controlling the robot.

This table provides a concise overview of the main equipment required for the
project, helping to ensure that all necessary components are accounted for
during planning and implementation.
Drawing: Circuit Diagram

Experimental Configuration:
The experimental setup consisted of the following components:

• ESP32
• Motor Driver
• Smartphone or tablet
• Power supply (9V)
• DC Motors
SMART HOME AUTOMATION SYSTEM WITH
FACE DETECTION USING ARDUINO UNO AND
OPENCV
Appendix A: Circuit Diagram

The following circuit diagram illustrates the setup for the home automation project
using Arduino, AI face detection with OpenCV, and Pyfirmata for communication
with the Arduino Uno board. The circuit includes the connection of a 2-channel
relay for controlling AC loads and the USB connection between the Arduino Uno
and a laptop/PC.
Appendix B: Python Code for Face Detection

Full Python code for face detection is in Chapter 3.

Appendix C: Arduino Sketch

Snippet of standard firmata code in Arduino IDE:

Refer to the highlighted files


Appendix D: System Components

- Arduino Uno
- 2-channel Relay
- USB Cable
- Laptop/PC
- AC Loads
- OpenCV Library
- Pyfirmata Library
- Haarcascade_frontalface_default XML file

Appendix E: Future Enhancements

One potential future enhancement for this system is the implementation of sleep
detection for drivers. By utilizing additional sensors, such as eye-tracking or heart
rate monitors, the system could detect signs of drowsiness in drivers and trigger
alerts to prevent accidents. Additionally, the system could be integrated with a
braking system to automatically apply brakes in case of detected drowsiness, further
enhancing the safety features of the system.
