What Is AI Technology?
The terms AI, machine learning and deep learning are often used interchangeably, especially by companies in their marketing materials, but there are distinctions. The term AI, coined in the 1950s, refers to the simulation of human intelligence by machines. It covers an ever-changing set of capabilities as new technologies are developed. Technologies that come under the umbrella of AI include machine learning and deep learning.
Machine learning enables software applications to become more accurate at predicting outcomes without
being explicitly programmed to do so. Machine learning algorithms use historical data as input to predict
new output values. This approach became vastly more effective with the rise of large data sets to train on.
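For illustration, here is a minimal sketch of that idea in Python, assuming the scikit-learn library is installed; the ad-spend and sales figures are invented toy data, not from the article.

    # A minimal sketch of learning from historical data (assumes scikit-learn).
    from sklearn.linear_model import LinearRegression

    # Historical inputs (e.g., monthly ad spend) and observed outcomes (sales).
    X_history = [[10], [20], [30], [40]]
    y_history = [15, 25, 33, 46]

    model = LinearRegression()
    model.fit(X_history, y_history)  # learn the input-output relationship

    # Predict an output value for an input the model has never seen.
    print(model.predict([[50]]))

No prediction rule is written by hand; the model infers the relationship from the historical examples alone.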
Deep learning, a subset of machine learning, is based on our understanding of how the brain is structured. Deep learning's use of artificial neural network structures is the underpinning of recent advances in AI, including self-driving cars and chatbots such as ChatGPT.
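As a rough illustration of the idea, the sketch below stacks a few layers of artificial neurons into a small network, assuming the PyTorch library; the layer sizes and random input are arbitrary.

    # A minimal artificial neural network sketch (assumes PyTorch).
    import torch
    import torch.nn as nn

    model = nn.Sequential(
        nn.Linear(4, 8),  # input layer -> hidden layer of 8 "neurons"
        nn.ReLU(),        # nonlinear activation, loosely inspired by neuron firing
        nn.Linear(8, 1),  # hidden layer -> single output
    )

    x = torch.randn(1, 4)  # one example with 4 input features
    print(model(x))        # forward pass through the stacked layers

A deep network is this same pattern repeated across many more layers and trained on far more data.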
While the huge volume of data created on a daily basis would bury a human
researcher, AI applications using machine learning can take that data and
quickly turn it into actionable information. As of this writing, a primary
disadvantage of AI is that it is expensive to process the large amounts of data
AI programming requires. As AI techniques are incorporated into more
products and services, organizations must also be attuned to AI's potential to
create biased and discriminatory systems, intentionally or inadvertently.
Disadvantages of AI
The following are some disadvantages of AI.
• Expensive. Processing the large amounts of data that AI programming requires is costly.
Machine vision. This technology gives a machine the ability to see. Machine
vision captures and analyzes visual information using a camera, analog-to-
digital conversion and digital signal processing. It is often compared to human
eyesight, but machine vision isn't bound by biology and can be programmed
to see through walls, for example. It is used in a range of applications from
signature identification to medical image analysis. Computer vision, which is
focused on machine-based image processing, is often conflated with machine
vision.
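A minimal sketch of one machine vision step is shown below, assuming the OpenCV (cv2) library; "part.png" is a hypothetical file standing in for a digitized camera frame.

    # A minimal machine vision sketch (assumes OpenCV; part.png is hypothetical).
    import cv2

    frame = cv2.imread("part.png")                  # digitized camera frame
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)  # digital signal processing
    edges = cv2.Canny(gray, 100, 200)               # extract edge features

    # Downstream logic could inspect `edges` to verify a signature outline
    # or flag a defective part on an assembly line.
    print(edges.shape)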
Natural language processing (NLP). This is the processing of human
language by a computer program. One of the older and best-known examples
of NLP is spam detection, which looks at the subject line and text of an email
and decides if it's junk. Current approaches to NLP are based on machine
learning. NLP tasks include text translation, sentiment analysis and speech
recognition.
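To make the spam-detection example concrete, here is a minimal machine learning sketch, assuming scikit-learn; the four-email training corpus is invented for illustration.

    # A minimal ML-based spam detector sketch (assumes scikit-learn).
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.naive_bayes import MultinomialNB

    emails = ["win a free prize now", "meeting agenda attached",
              "free money click here", "lunch tomorrow?"]
    labels = [1, 0, 1, 0]  # 1 = spam, 0 = not spam

    vectorizer = CountVectorizer()
    X = vectorizer.fit_transform(emails)  # turn email text into word counts

    clf = MultinomialNB().fit(X, labels)  # learn word patterns typical of spam
    print(clf.predict(vectorizer.transform(["free prize inside"])))  # -> [1]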
Security. AI and machine learning are at the top of the buzzword list security
vendors use to market their products, so buyers should approach with caution.
Still, AI techniques are being successfully applied to multiple aspects of
cybersecurity, including anomaly detection, solving the false-positive
problem and conducting behavioral threat analytics. Organizations use
machine learning in security information and event management (SIEM)
software and related areas to detect anomalies and identify suspicious
activities that indicate threats. By analyzing data and using logic to identify
similarities to known malicious code, AI can surface alerts about new and
emerging attacks much sooner than human analysts or earlier generations of
security technology.
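As a rough sketch of the anomaly detection idea, the example below uses scikit-learn's IsolationForest on invented per-account features; real SIEM pipelines work with far richer event data.

    # A minimal anomaly detection sketch (assumes scikit-learn; data invented).
    from sklearn.ensemble import IsolationForest

    # Each row: [failed logins per hour, megabytes transferred] for one account.
    events = [[1, 5], [0, 4], [2, 6], [1, 5], [40, 900]]  # last row is unusual

    detector = IsolationForest(random_state=0).fit(events)
    print(detector.predict(events))  # -1 flags anomalies, 1 marks normal activity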
Policymakers in the U.S. have yet to issue AI legislation, but that could
change soon. A "Blueprint for an AI Bill of Rights" published in October 2022
by the White House Office of Science and Technology Policy (OSTP) guides
businesses on how to implement ethical AI systems. The U.S. Chamber of
Commerce also called for AI regulations in a report released in March 2023.
The 19th century and the first half of the 20th century brought forth the
foundational work that would give rise to the modern computer. In 1836,
Cambridge University mathematician Charles Babbage and Augusta Ada King,
Countess of Lovelace, invented the first design for a programmable machine.
1950s. With the advent of modern computers, scientists could test their ideas
about machine intelligence. One method for determining whether a computer
has intelligence was devised by the British mathematician and World War II
code-breaker Alan Turing. The Turing test focused on a computer's ability to
fool interrogators into believing its responses to their questions were made by
a human being.
1956. The modern field of artificial intelligence is widely cited as starting this
year during a summer conference at Dartmouth College. The conference was
attended by 10 luminaries in the field, including AI pioneers Marvin Minsky,
Oliver Selfridge and John McCarthy, who is credited with coining the
term artificial intelligence. Also in attendance were Allen Newell, a computer
scientist, and Herbert A. Simon, an economist, political scientist and cognitive
psychologist. The two presented their groundbreaking Logic Theorist, a
computer program capable of proving certain mathematical theorems and
referred to as the first AI program.
1950s and 1960s. In the wake of the Dartmouth College conference, leaders
in the fledgling field of AI predicted that a man-made intelligence equivalent to
the human brain was around the corner, attracting major government and
industry support. Indeed, nearly 20 years of well-funded basic research
generated significant advances in AI: For example, in the late 1950s, Newell
and Simon published the General Problem Solver (GPS) algorithm, which fell
short of solving complex problems but laid the foundations for developing
more sophisticated cognitive architectures; and McCarthy developed Lisp, a
language for AI programming still used today. In the mid-1960s, MIT
Professor Joseph Weizenbaum developed ELIZA, an early NLP program that
laid the foundation for today's chatbots.
2010s. The decade between 2010 and 2020 saw a steady stream of AI
developments. These include the launch of Apple's Siri and Amazon's Alexa
voice assistants; IBM Watson's victories on Jeopardy; self-driving cars; the
development of the first generative adversarial network; the launch of
TensorFlow, Google's open source deep learning framework; the founding of
research lab OpenAI, developers of the GPT-3 language model and Dall-E
image generator; the defeat of world Go champion Lee Sedol by Google
DeepMind's AlphaGo; and the implementation of AI-based systems that detect
cancers with a high degree of accuracy.
2020s. The current decade has seen the advent of generative AI, a type of
artificial intelligence technology that can produce new content. Generative AI
starts with a prompt that could be in the form of a text, an image, a video, a
design, musical notes or any input that the AI system can process. Various AI
algorithms then return new content in response to the prompt. Content can
include essays, solutions to problems, or realistic fakes created from pictures
or audio of a person. The abilities of language models such as OpenAI's
ChatGPT, Google's Bard and Microsoft's Megatron-Turing NLG have wowed the
world, but the technology is still in its early stages, as evidenced by its
tendency to hallucinate or skew answers.
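As a small illustration of prompt-driven generation, the sketch below assumes the Hugging Face transformers library and uses the small public gpt2 model purely as a stand-in for the much larger systems named above.

    # A minimal prompt-to-text generation sketch (assumes Hugging Face transformers).
    from transformers import pipeline

    generator = pipeline("text-generation", model="gpt2")
    result = generator("Machine vision lets computers", max_length=40)
    print(result[0]["generated_text"])  # new content produced from the prompt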
Over the last several years, the symbiotic relationship between AI discoveries
at Google, Microsoft and OpenAI and the hardware innovations pioneered by
Nvidia has enabled running ever-larger AI models on more connected GPUs,
driving game-changing improvements in performance and scalability.
The collaboration among these AI luminaries was crucial for the recent
success of ChatGPT, not to mention dozens of other breakout AI services.
Here is a rundown of important innovations in AI tools and services.
Transformers. Google, for example, led the way in finding a more efficient
process for provisioning AI training across a large cluster of commodity PCs
with GPUs. This paved the way for the development of transformers, which
automate many aspects of training AI on unlabeled data.
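At the core of a transformer is scaled dot-product attention, which lets every token weigh every other token when building its representation. The sketch below implements it in plain NumPy with arbitrary toy shapes, as an illustration rather than a production implementation.

    # A minimal scaled dot-product attention sketch (plain NumPy, toy shapes).
    import numpy as np

    def attention(Q, K, V):
        d = Q.shape[-1]
        scores = Q @ K.T / np.sqrt(d)                   # token-to-token affinities
        weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
        weights /= weights.sum(axis=-1, keepdims=True)  # softmax over the keys
        return weights @ V                              # weighted mix of values

    Q = K = V = np.random.rand(3, 4)  # 3 tokens, 4-dimensional embeddings
    print(attention(Q, K, V).shape)   # -> (3, 4)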