Artificial Intelligence


Artificial intelligence (AI)

AI makes it possible for machines to learn from experience, adjust to new inputs and perform human-like tasks. Most AI examples that you hear about today – from chess-playing computers to self-driving cars – rely heavily on deep learning and natural language processing. Using these technologies, computers can be trained to accomplish specific tasks by processing large amounts of data and recognizing patterns in the data.

Artificial Intelligence History


The term artificial intelligence was coined in 1956, but AI has become more
popular today thanks to increased data volumes, advanced algorithms, and
improvements in computing power and storage.

Early AI research in the 1950s explored topics like problem solving and
symbolic methods. In the 1960s, the US Department of Defense took interest
in this type of work and began training computers to mimic basic human
reasoning. For example, the Defense Advanced Research Projects Agency
(DARPA) completed street mapping projects in the 1970s. And DARPA
produced intelligent personal assistants in 2003, long before Siri, Alexa or
Cortana were household names.

This early work paved the way for the automation and formal reasoning that
we see in computers today, including decision support systems and smart
search systems that can be designed to complement and augment human
abilities.

While Hollywood movies and science fiction novels depict AI as human-like robots that take over the world, the current evolution of AI technologies isn’t that scary – or quite that smart. Instead, AI has evolved to provide many specific benefits in every industry. Keep reading for modern examples of artificial intelligence in health care, retail and more.

1950s–1970s – Neural Networks: early work with neural networks stirs excitement for “thinking machines.”

1980s–2010s – Machine Learning: machine learning becomes popular.

Present Day – Deep Learning: deep learning breakthroughs drive the AI boom.

Why is artificial intelligence important?

AI automates repetitive learning and discovery through data. Instead of automating manual tasks, AI performs frequent, high-volume, computerized tasks. And it does so reliably and without fatigue. Of course, humans are still essential to set up the system and ask the right questions.

AI adds intelligence to existing products. Many products you already use will be improved with AI capabilities, much like Siri was added as a feature to a new generation of Apple products. Automation, conversational platforms, bots and smart machines can be combined with large amounts of data to improve many technologies. Upgrades at home and in the workplace range from security intelligence and smart cams to investment analysis.

AI adapts through progressive learning algorithms to let the data do the programming. AI finds structure and regularities in data so that algorithms can acquire skills. Just as an algorithm can teach itself to play chess, it can teach itself what product to recommend next online. And the models adapt when given new data.
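
To make the idea of progressive learning concrete, here is a minimal sketch of a classifier that updates itself batch by batch instead of being retrained from scratch. It assumes scikit-learn and uses synthetic data; none of these details come from the article.

    import numpy as np
    from sklearn.linear_model import SGDClassifier

    # An incremental model: partial_fit() updates it as new data arrives.
    model = SGDClassifier()
    classes = np.array([0, 1])

    # First batch of (synthetic) examples: the model learns an initial rule.
    X1 = np.random.rand(100, 4)
    y1 = (X1[:, 0] > 0.5).astype(int)
    model.partial_fit(X1, y1, classes=classes)

    # A later batch: the model adapts without starting over.
    X2 = np.random.rand(100, 4)
    y2 = (X2[:, 0] > 0.5).astype(int)
    model.partial_fit(X2, y2)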

AI analyzes more and deeper data using neural networks that have many
hidden layers. Building a fraud detection system with five hidden layers used to
be impossible. All that has changed with incredible computer power and big
data. You need lots of data to train deep learning models because they learn
directly from the data. 
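
As a hedged sketch of what a fraud detection model with five hidden layers might look like in Keras: the layer sizes, and the assumption of 30 transaction features, are illustrative choices, not taken from the article.

    from tensorflow import keras

    # A binary fraud classifier with five hidden layers.
    model = keras.Sequential([
        keras.layers.Input(shape=(30,)),              # assume 30 transaction features
        keras.layers.Dense(64, activation="relu"),    # hidden layer 1
        keras.layers.Dense(64, activation="relu"),    # hidden layer 2
        keras.layers.Dense(32, activation="relu"),    # hidden layer 3
        keras.layers.Dense(32, activation="relu"),    # hidden layer 4
        keras.layers.Dense(16, activation="relu"),    # hidden layer 5
        keras.layers.Dense(1, activation="sigmoid"),  # fraud probability
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
    # model.fit(X_train, y_train, epochs=10)  # requires labeled transaction data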

AI achieves incredible accuracy through deep neural networks. For example, your interactions with Alexa and Google are all based on deep learning. And these products keep getting more accurate the more you use them. In the medical field, AI techniques from deep learning and object recognition can now be used to pinpoint cancer on medical images with improved accuracy.

AI gets the most out of data. When algorithms are self-learning, the data itself
is an asset. The answers are in the data. You just have to apply AI to find them.
Since the role of the data is now more important than ever, it can create a
competitive advantage. If you have the best data in a competitive industry,
even if everyone is applying similar techniques, the best data will win.

Health Care

AI applications can provide personalized medicine and X-ray readings. Personal health care assistants can act as life coaches, reminding you to take your pills, exercise or eat healthier.

Retail

AI provides virtual shopping capabilities that offer personalized recommendations and discuss purchase options with the consumer. Stock management and site layout technologies will also be improved with AI.

Manufacturing

AI can analyze factory IoT data as it streams from connected equipment to forecast expected load and demand using recurrent networks, a specific type of deep learning network used with sequence data.
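
A minimal sketch of what such a recurrent forecasting model could look like in Keras; the window of 24 time steps and 8 sensor channels are assumptions made for illustration.

    from tensorflow import keras

    # Input: a sliding window of 24 time steps, each with 8 sensor readings.
    model = keras.Sequential([
        keras.layers.Input(shape=(24, 8)),
        keras.layers.LSTM(32),    # recurrent layer summarizes the sequence
        keras.layers.Dense(1),    # forecast of the next load value
    ])
    model.compile(optimizer="adam", loss="mse")
    # model.fit(windows, next_loads, epochs=10)  # requires historical sensor data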

Banking

Artificial intelligence enhances the speed, precision and effectiveness of human efforts. In financial institutions, AI techniques can be used to identify which transactions are likely to be fraudulent, deliver fast and accurate credit scoring, and automate manually intensive data management tasks.
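
As one hedged example of fast credit scoring, a gradient boosting classifier can turn applicant features into a repayment probability. The features and data below are synthetic placeholders, not from the article.

    import numpy as np
    from sklearn.ensemble import GradientBoostingClassifier

    # Synthetic applicant features: e.g. income, debt ratio, history length.
    X = np.random.rand(500, 3)
    y = (X[:, 0] > X[:, 1]).astype(int)       # toy "repaid the loan" label

    scorer = GradientBoostingClassifier().fit(X, y)
    print(scorer.predict_proba(X[:1])[0, 1])  # probability of repayment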

How Artificial Intelligence Works


AI works by combining large amounts of data with fast, iterative processing
and intelligent algorithms, allowing the software to learn automatically from
patterns or features in the data. AI is a broad field of study that includes many
theories, methods and technologies, as well as the following major subfields:

Machine Learning

Machine learning automates analytical model building. It uses methods from neural networks, statistics, operations research and physics to find hidden insights in data without explicitly being programmed for where to look or what to conclude.
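
The “without explicitly being programmed” idea can be shown in a few lines: below, a decision tree recovers a hidden rule from examples alone. This is a sketch assuming scikit-learn and synthetic data.

    import numpy as np
    from sklearn.tree import DecisionTreeClassifier

    X = np.random.rand(200, 2)
    y = (X[:, 0] + X[:, 1] > 1).astype(int)  # hidden rule the model must discover

    clf = DecisionTreeClassifier(max_depth=3).fit(X, y)
    print(clf.predict([[0.9, 0.8], [0.1, 0.2]]))  # expected: [1 0], learned from data alone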

Neural Networks

A neural network is a kind of machine learning model made up of interconnected units (like neurons) that process information by responding to external inputs and relaying information between units. The process requires multiple passes at the data to find connections and derive meaning from undefined data.
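
A bare-bones sketch of the “interconnected units” idea: each unit combines its weighted inputs and passes the result on. The weights here are random for illustration; training would adjust them over multiple passes at the data.

    import numpy as np

    def layer(inputs, weights, bias):
        # Each unit sums its weighted inputs, then applies a nonlinearity.
        return np.tanh(inputs @ weights + bias)

    rng = np.random.default_rng(0)
    x = rng.random(4)                        # external inputs
    h = layer(x, rng.random((4, 3)), 0.1)    # hidden units relay information
    out = layer(h, rng.random((3, 1)), 0.1)  # output unit responds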

Deep Learning

Deep learning uses huge neural networks with many layers of processing units,
taking advantage of advances in computing power and improved training
techniques to learn complex patterns in large amounts of data. Common
applications include image and speech recognition.
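
A “huge” network is out of scope here, but this small Keras sketch shows the layered structure the paragraph describes, in the image recognition setting. The input shape and layer choices are illustrative assumptions.

    from tensorflow import keras

    # Stacked processing layers: early layers find edges and textures,
    # deeper layers combine them into higher-level patterns.
    model = keras.Sequential([
        keras.layers.Input(shape=(28, 28, 1)),         # small grayscale images
        keras.layers.Conv2D(16, 3, activation="relu"),
        keras.layers.MaxPooling2D(),
        keras.layers.Conv2D(32, 3, activation="relu"),
        keras.layers.MaxPooling2D(),
        keras.layers.Flatten(),
        keras.layers.Dense(10, activation="softmax"),  # e.g. 10 image classes
    ])
    model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")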

Additionally, several technologies enable and support AI:

Computer vision relies on pattern recognition and deep learning to recognize what’s in a picture or video. When machines can process, analyze and understand images, they can capture images or videos in real time and interpret their surroundings.
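
One way to see this in practice is to run a pretrained network over a photo. The sketch below assumes TensorFlow/Keras, and "photo.jpg" is a placeholder path.

    import numpy as np
    from tensorflow import keras

    # A network pretrained on ImageNet recognizes what's in the picture.
    model = keras.applications.MobileNetV2(weights="imagenet")
    img = keras.utils.load_img("photo.jpg", target_size=(224, 224))  # placeholder path
    x = keras.applications.mobilenet_v2.preprocess_input(
        np.expand_dims(keras.utils.img_to_array(img), axis=0)
    )
    preds = model.predict(x)
    print(keras.applications.mobilenet_v2.decode_predictions(preds, top=3)[0])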

Natural language processing (NLP) is the ability of computers to analyze, understand and generate human language, including speech. The next stage of NLP is natural language interaction, which allows humans to communicate with computers using normal, everyday language to perform tasks.
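
As a small illustration of the “analyze and understand” side, a ready-made model can classify the sentiment of everyday language. This assumes the Hugging Face transformers library, which the article does not name.

    from transformers import pipeline

    # A pretrained model analyzes the sentiment of everyday language.
    analyzer = pipeline("sentiment-analysis")
    print(analyzer("I love how easy this product is to use."))
    # e.g. [{'label': 'POSITIVE', 'score': 0.99...}]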

Graphics processing units (GPUs) are key to AI because they provide the heavy compute power that’s required for iterative processing. Training neural networks requires big data plus compute power.
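
A minimal sketch of what “using the GPU” looks like in code, with PyTorch shown as one common option (an assumed choice, not from the article):

    import torch

    # Run the heavy matrix math on a GPU when one is available.
    device = "cuda" if torch.cuda.is_available() else "cpu"
    model = torch.nn.Linear(1024, 10).to(device)   # weights live on the device
    batch = torch.randn(256, 1024, device=device)  # data lives on the device
    out = model(batch)                             # computed on the GPU if present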

The Internet of Things generates massive amounts of data from connected devices, most of it unanalyzed. Automating models with AI will allow us to use more of it.

Advanced algorithms are being developed and combined in new ways to analyze more data faster and at multiple levels. This intelligent processing is key to identifying and predicting rare events, understanding complex systems and optimizing unique scenarios.

APIs, or application programming interfaces, are portable packages of code that make it possible to add AI functionality to existing products and software packages. They can add image recognition capabilities to home security systems and Q&A capabilities that describe data, create captions and headlines, or call out interesting patterns and insights in data.
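
In practice, adding such a capability can be as simple as one HTTP call. The endpoint URL, file name and response fields below are hypothetical placeholders, not a real service.

    import requests

    # Send an image to a (hypothetical) recognition endpoint.
    with open("frontdoor.jpg", "rb") as f:
        resp = requests.post(
            "https://api.example.com/v1/recognize",   # placeholder URL
            headers={"Authorization": "Bearer YOUR_API_KEY"},
            files={"image": f},
        )
    print(resp.json())  # e.g. {"labels": ["person", "package"]}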

In summary, the goal of AI is to provide software that can reason on input and
explain on output. AI will provide human-like interactions with software and
offer decision support for specific tasks, but it’s not a replacement for humans
– and won’t be anytime soon. 
