Holographic Data Storage System - Seminar Report
A
Seminar Report on
Session
(2009-2010)
Submitted To Submitted By
Mr. Naveen Hemrajani Anoop Nair
Head of Department Roll No. - 10
Computer Science Computer Science
Computer Science Engineering
Gyan Vihar School of Engineering and Technology
Jagatpura, Jaipur
Certificate
This is to certify that the Final Seminar Report entitled “Holographic Data Storage
System” is an authentic record of the work carried out by Anoop Nair, Computer
Science, VIII Semester. The report is submitted to Gyan Vihar School of
Engineering and Technology, Jaipur, India, in partial fulfillment of the requirements
for the award of the degree Bachelor of Technology in Computer Science
Engineering during the academic year 2009-2010, VIII Semester.
It is further certified that the work embodied in the report has not been submitted to
any other University or Institute for the award of any degree or diploma.
Naveen Hemrajani
Head of Department
(Computer Science)
G.V.S.E.T
1. Abstract ............................................................ 8
2. Introduction ........................................................ 9
3. Technical Aspect ................................................... 14
   Holographic Memory Layout ......................................... 16
5. Holograms .......................................................... 23
   Volume Holograms .................................................. 23
7. Working ............................................................ 31
8. Application to Binary .............................................. 34
   Spatial Light Modulator ........................................... 34
   Page Data Access .................................................. 34
9. Multiplexing ....................................................... 36
   Angular Multiplexing .............................................. 36
   Wavelength Multiplexing ........................................... 37
   Spatial Multiplexing .............................................. 37
   Phase-Encoded Multiplexing ........................................ 37
   Combining Multiplexing Methods .................................... 38
10. Error Correction .................................................. 38
    Recording Errors ................................................. 38
    Page-Level Parity Bits ........................................... 39
    Smart Interfacing ................................................ 40
    Intelligent Interfacing .......................................... 40
11. Implementation .................................................... 41
12. Holographic Memory vs. Existing Memory Technology ................. 43
18. Outlook ........................................................... 86
19. Application ....................................................... 88
    Holographic Versatile Disc ....................................... 88
21. Comparison ........................................................ 98
22. HVD at a Glance ................................................... 99
23. References ....................................................... 101
A dichroic mirror layer between the holographic data and the servo data reflects the blue-green
laser while letting the red laser pass through. This prevents interference from refraction of the
blue-green laser off the servo data pits and is an advance over past holographic storage media,
which either experienced too much interference or lacked the servo data entirely, making them
incompatible with current CD and DVD drive technology. These discs can hold up to 3.9
terabytes (TB) of information, which is approximately 6,000 times the capacity of a CD-ROM,
830 times the capacity of a DVD, 160 times the capacity of a single-layer Blu-ray Disc, and
about 8 times the capacity of standard computer hard drives as of 2006. The HVD also has a
transfer rate of 1 gigabit/s. Optware released a 200 GB disc in early June 2006, and Maxell
followed in September 2006 with a 300 GB disc and a transfer rate of 20 MB/s.
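The capacity ratios quoted above are easy to verify with a back-of-envelope check. The sketch below assumes 650 MB for a CD-ROM, 4.7 GB for a single-layer DVD, and 25 GB for a single-layer Blu-ray Disc (common published figures, not numbers taken from this report):

```python
hvd = 3.9e12                       # 3.9 TB expressed in bytes
cd, dvd, bd = 650e6, 4.7e9, 25e9   # assumed capacities in bytes

print(round(hvd / cd))   # ~6,000 times a CD-ROM
print(round(hvd / dvd))  # ~830 times a DVD
print(round(hvd / bd))   # ~156 times a single-layer Blu-ray Disc
```

The results line up with the "6,000", "830", and "160" multiples stated above.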
Devices that use light to store and read data have been the backbone of data storage for nearly two
decades. Compact discs revolutionized data storage in the early 1980s, allowing multi-megabytes
of data to be stored on a disc that has a diameter of a mere 12 centimeters and a thickness of about
1.2 millimeters. In 1997, an improved version of the CD, called a digital versatile disc (DVD),
was released, which enabled the storage of full-length movies on a single disc.
CDs and DVDs are the primary data storage methods for music, software, personal computing and
video. A CD can hold 783 megabytes of data. A double-sided, double-layer DVD can hold 15.9
GB of data, which is about eight hours of movies. These conventional storage mediums meet
today's storage needs, but storage technologies have to evolve to keep pace with increasing
consumer demand. CDs, DVDs and magnetic storage all store bits of information on the surface of
a recording medium. In order to increase storage capabilities, scientists are now working on a new
optical storage method called holographic memory that will go beneath the surface and use the
volume of the recording medium for storage, instead of only the surface area. Three-dimensional
data storage will be able to store more information in a smaller space and offer faster data transfer
times.
Holographic memory is a developing technology that promises to revolutionize storage
systems. It can store up to 1 TB of data in a crystal the size of a sugar cube; data from more
than 1,000 CDs could fit into a holographic memory system. Most computer hard drives
available today hold only 10 to 40 GB of data, a small fraction of what a holographic memory
system can hold.
Conventional memories use only the surface to store the data. But holographic data storage
systems use the volume to store data. It has more advantages than conventional storage systems. It
is based on the principle of holography. However, both magnetic and conventional optical data
storage technologies, where individual bits are stored as distinct magnetic or optical changes on
the surface of a recording medium, are approaching physical limits beyond which individual bits
may be too small or too difficult to store. Storing information throughout the volume of a medium
—not just on its surface— offers an intriguing high-capacity alternative. Holographic data storage
is a volumetric approach which, although conceived decades ago, has made recent progress
toward practicality with the appearance of lower-cost enabling technologies, significant results
from longstanding research efforts, and progress in holographic recording materials.
A large number of these interference gratings or patterns can be superimposed in the same thick
piece of media and can be accessed independently, as long as they are distinguishable by the
direction or the spacing of the gratings. Such separation can be accomplished by changing the
angle between the object and reference wave or by changing the laser wavelength. Any particular
data page can then be read out independently by illuminating the stored gratings with the reference
wave that was used to store that page. Because of the thickness of the hologram, this reference
wave is diffracted by the interference patterns in such a fashion that only the desired object beam
is significantly reconstructed and imaged on an electronic camera. The theoretical limits for the
storage density of this technique are around tens of terabits per cubic centimeter.
Figure 1:
Storage of one bit of information as a hologram: (a) Superposition of the spherical wave from
one bit with a coherent plane-wave reference beam, forming an interference pattern. (b)
Exposure of a photosensitive medium to the interference pattern. (c) Record of the interference
grating, stored as changes in the refractive properties of the medium.
Figure 2:
Reading of holographic information by (a) Illumination with the reference beam, which is
diffracted by the stored interference pattern to reconstruct the original spherical wavefront of
the object beam. This beam can be imaged onto a single small detector, resulting in the
retrieval of a single bit. (b) Illumination with the diverging object beam, which is diffracted by
the stored interference pattern. This beam can be focused onto a detector, representing an
optical measurement of the correlation between the stored data and the illuminating object
beam, allowing content-addressable searching. (c) Illumination with a counter-propagating (or
"phase-conjugate") reference beam, which is diffracted by the stored interference pattern to
reconstruct a phase-conjugate copy of the original object beam. This phase-conjugate object
beam returns to its original point of origin, where the stored bit value can be read without
requiring a high-quality imaging system.
The data to be stored are imprinted onto the object beam with a pixelated input device called a
spatial light modulator (SLM); typically, this is a liquid crystal panel similar to those on laptop
computers or in modern camcorder viewfinders. To retrieve data without error, the system must
include a high-quality imaging system, one capable of directing this complex optical wavefront
through the recording medium, where the wavefront is stored and then later retrieved, and then
onto a pixelated camera chip (Figure 3).
Figure 3:
Basic holographic data system. Data are imprinted onto
the object beam with a pixelated input device called a
spatial light modulator (SLM). A pair of lenses images
the data through the storage material onto pixelated
detector array such as a charge-coupled device (CCD).
A reference beam intersects the object beam in the
storage material, allowing the storage and later
retrieval of holograms
When the information is to be retrieved or read out from the hologram, only the reference beam is
necessary. The beam is sent into the material in exactly the same way as when the hologram was
written. As a result of the index changes in the material that were created during writing, the beam
splits into two parts. One of these parts recreates the signal beam where the information is stored.
Something like a CCD camera can be used to convert this information into a more usable form.
Holograms can theoretically store one bit in each cube of material whose side equals the
wavelength of the writing light. For example, a helium-neon laser emits red light at a
wavelength of 632.8 nm. Using light of this wavelength, perfect holographic storage could store
4 gigabits per cubic millimeter. In practice, the data density would be much lower, for at least
four reasons:
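The 4-gigabits-per-cubic-millimeter figure quoted above follows directly from dividing a 1 mm cube into wavelength-sized cells, as this short sketch shows:

```python
# One bit per cube whose side equals the He-Ne wavelength (632.8 nm).
wavelength_mm = 632.8e-6            # 632.8 nm expressed in millimetres
cells_per_mm = 1.0 / wavelength_mm  # ~1,580 cells along each axis of a 1 mm cube
bits_per_mm3 = cells_per_mm ** 3    # total cells (bits) in the cube

print(f"{bits_per_mm3:.2e} bits/mm^3")  # ~3.95e9, i.e. about 4 gigabits
```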
Floppy Disk
Floppy disk drives provide relatively fast data access because they access data randomly rather
than sequentially. Floppy drives provide an average data access time of less than 100
milliseconds (ms). The 1.44-MB, 3.5-inch floppy is useful for storing and backing up small data
files, can be used to boot computer systems, and has long been the standard for data interchange
between PCs. However, it provides only a fraction of the storage capacity required for many
files and most software programs in use today. Data transfer is also slow, averaging around
0.06 MB/sec.
Floppy disk
Optical Formats
Optical RMSD formats use a laser light source to read and/or write digital data to disc. CD and
DVD are two major optical formats. CDs and DVDs have similar compositions consisting of a
label, a protective layer, a reflective layer (aluminum, silver, or gold), a digital-data layer molded
in polycarbonate, and a thick polycarbonate bottom layer
CD formats include:
CD-ROM
The CD-ROM standard was established in 1984. CD-ROMs quickly evolved into a low-cost
digital storage option on the strength of the CD-audio industry. Data bits are permanently
stored on a CD as a spiral track of physically molded pits in the surface of a plastic data layer
that is coated with reflective aluminum. Smooth areas surrounding the pits are called lands.
CDs are extremely durable because the optical pickup (laser light source, lenses and optical
elements, photoelectric sensors, and amplifiers) never touches the disc. Because data is read
through the thick bottom layer, most scratches and dust on the disc surface are out of focus, so
they do not interfere with the reading process. One CD-ROM (650-700 MB) can store the data
of more than 450 floppy disks. Data access times range from 80 to 120 ms, and data transfer
rates are approximately 6 MB/sec.
DVD-ROM
The DVD-ROM standard, introduced in 1995, was the result of a DVD consortium agreement.
Like CD drives, DVD drives read data through the disc substrate, reducing interference from
surface dust and scratches. However, DVD-ROM technology provides seven times the storage
capacity of CDs and accomplishes most of this increase by refining the technology used for CD
systems. The distance between recording tracks is less than half that used for CDs. The pit size
is also less than half that of CDs, which requires a shorter laser wavelength to read the smaller
pits. These features alone give DVD-ROM discs 4.5 times the storage capacity of CDs. DVDs
can also store data on both sides of the disc; manufacturers create the two-sided structure by
bonding two thinner substrates together, providing the potential to double a DVD's storage
capacity. Single-sided DVD discs have the two fused substrates, but only one side contains
data.
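The "4.5 times" figure can be checked against the published physical dimensions of the two formats. The numbers below come from the CD and DVD physical specifications (track pitch 1.6 µm vs. 0.74 µm, minimum pit length 0.83 µm vs. 0.40 µm), not from this report:

```python
cd_track, dvd_track = 1.6, 0.74    # track pitch, micrometres
cd_pit, dvd_pit = 0.83, 0.40       # minimum pit length, micrometres

# Areal density scales with the product of track-pitch and pit-length reductions.
areal_gain = (cd_track / dvd_track) * (cd_pit / dvd_pit)
print(f"{areal_gain:.1f}x")        # ~4.5x, matching the figure above
```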
DVD-R drives were introduced in 1997 to provide write-once capability on DVD-R discs used for
producing disc masters in software development and for multimedia post-production. This
technology, sometimes referred to as DVD-R for authoring, is limited to niche applications
because drives and media are expensive. DVD-R discs employ a photosensitive dye technology
similar to CD-R media. At 3.95 GB per side, the first DVD-R discs provided a little less storage
capacity than DVD-ROM discs. That capacity has now been extended to the 4.7-GB capacity of
DVD-ROM discs. The 1X DVD-R data transfer rate is 1.3 MB/sec. Most DVD-ROM drives and
DVD video players read DVD-R discs. Slightly modified DVD-R drives and discs have recently
become available for general use.
DVD-RAM
+RW
Sony and Philips were founding members of the DVD consortium, but broke away to introduce
the DVD+RW (now called +RW) phase change, rewritable technology in 1997. Discs can be
written approximately 1000 times, which makes them a good option for video recording, but not
optimal for data storage. +RW technology's strongest feature is its backward compatibility with
DVD-ROM drives and DVD video players.
Magneto-Optical Formats
Magneto-optical (MO) technology combines the strengths of magnetic and optical technologies by
using a laser to read data and the combination of a laser and magnetic field to write data. The top
(label side) of the disk is exposed to a magnetic field to write data, and a laser light source targets
the data layer through the bottom substrate to read data.
There are 3.5- and 5.25-inch disk formats that contain a magnetic alloy layer. Magnetic
particles in the alloy are very stable and resist changing polarity at room temperature. Data bits
are recorded on this magnetic layer by heating it with a focused laser beam in the presence of a
magnetic field. Changes in the magnetic orientation of the data bits along a track represent 0s
and 1s, much like on hard disks and other magnetic media. The magnetic layer also changes the
rotation, or polarization, of reflected laser light depending on the 0-or-1 polarity of the
magnetic bits. This property, called the Kerr effect, is used to read the data. MO systems also
record the data bits vertically rather than horizontally.
Magneto-Optical disk
Volume Holograms
To make the hologram, the reference and object beams are overlapped in a photosensitive
medium, such as a photopolymer or inorganic crystal. The resulting optical interference pattern
creates chemical and/or physical changes in the absorption, refractive index or thickness of the
storage media, preserving a replica of the illuminating interference pattern. Since this pattern
contains information about both the amplitude and the phase of the two light beams, when the
recording is illuminated by the readout beam, some of the light is diffracted to “reconstruct” a
weak copy of the object beam. If the object beam originally came from a 3–D object, then the
reconstructed hologram makes the 3–D object reappear. Since the diffracted wave front
accumulates energy from throughout the thickness of the storage material, a small change in either
the wavelength or angle of the readout beam generates enough destructive interference to make
the hologram effectively disappear through Bragg selectivity.
As the material becomes thicker, accessing a stored volume hologram requires tight tolerances on
the stability and repeatability of the wavelength and incidence angle provided by the laser and
readout optics. However, destructive interference also opens up a tremendous opportunity: a small
storage volume can now store multiple superimposed holograms, each one distributed throughout
the entire volume. The destructive interference allows each of these stored holograms to be
independently accessed with its original reference beam. To record a second, angularly
multiplexed hologram, for instance, the angle of the reference beam is changed sufficiently so that
the reconstruction of the first hologram effectively disappears. The new incidence angle is used to
record a second hologram with a new object beam. The two holograms can be independently
accessed by changing the readout laser beam angle back and forth. For a 2-cm hologram
thickness, the angular sensitivity is only 0.0015 degrees. Therefore, it becomes possible to store
thousands of holograms within the allowable range of reference arm angles (typically 20–30
degrees). The maximum number of holograms stored at a single location to date is 10,000.
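The numbers quoted above can be sanity-checked: dividing the usable reference-arm range by the angular selectivity gives the order of magnitude of holograms that fit at one location. This is a rough sketch only; real systems leave guard bands between adjacent holograms, which is one reason the demonstrated figure of 10,000 is below the raw ratio:

```python
angular_selectivity_deg = 0.0015   # for a 2-cm hologram thickness, as quoted above
usable_range_deg = 25.0            # mid-point of the typical 20-30 degree range

# Upper bound on angularly multiplexed holograms at a single location.
max_holograms = usable_range_deg / angular_selectivity_deg
print(int(max_holograms))          # ~16,000 -- same order as the 10,000 demonstrated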
HOLOGRAPHY
Holographic data storage refers specifically to the use of holography to store and retrieve
digital data. To do this, digital data must be imposed onto an optical wavefront, stored
holographically with high volumetric density, and then extracted from the retrieved optical
wavefront with excellent data fidelity. A hologram preserves both the phase and amplitude of
an optical wavefront of interest, called the object beam, by recording the optical interference
pattern between it and a second coherent optical beam, the reference beam. The figure shows
this process.
The reference beam is designed to be simple to reproduce at a later stage (a common reference
beam is a plane wave: a light beam that propagates without converging or diverging). These
interference fringes are recorded if the two beams have been overlapped within a suitable
photosensitive medium, such as a photopolymer, inorganic crystal, or photographic film. The
bright and dark variations of the interference pattern create chemical and/or physical changes
in the photosensitive medium.
Plane wavefronts
A diffraction grating is a structure with a repeating pattern. A simple example is a metal plate with
slits cut at regular intervals. Light rays travelling through it are bent at an angle determined by λ,
the wavelength of the light and d, the distance between the slits and is given by sinθ = λ/d.
A very simple hologram can be made by superimposing two plane waves from the same light
source. One (the reference beam) hits the photographic plate normally and the other one (the
object beam) hits the plate at an angle θ. The relative phase between the two beams varies across
the photographic plate as 2π y sinθ/λ where y is the distance along the photographic plate. The
two beams interfere with one another to form an interference pattern. The relative phase changes
by 2π at intervals of d = λ/sinθ so the spacing of the interference fringes is given by d. Thus, the
relative phase of object and reference beam is encoded as the maxima and minima of the fringe
pattern.
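The fringe-spacing relation d = λ/sinθ can be evaluated directly. The sketch below uses the He-Ne wavelength mentioned elsewhere in this report and an assumed 30° angle between the beams (the angle is an illustrative choice, not a value from the report):

```python
import math

wavelength = 632.8e-9        # He-Ne laser wavelength, metres
theta = math.radians(30.0)   # assumed angle between object and reference beams

d = wavelength / math.sin(theta)   # fringe (grating) spacing
print(f"{d * 1e6:.3f} um")         # ~1.266 micrometres
```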
When the photographic plate is developed, the fringe pattern acts as a diffraction grating, and
when the reference beam is incident upon the photographic plate, it is partly diffracted into the
same angle θ at which the original object beam was incident. Thus, the object beam has been
reconstructed. The diffraction grating created by the two waves interfering has reconstructed the
"object beam" and it is therefore a hologram as defined above.
Point sources
A slightly more complicated hologram can be made using a point source of light as object beam
and a plane wave as reference beam to illuminate the photographic plate. An interference pattern
is formed which in this case is in the form of curves of decreasing separation with increasing
distance from the centre.
The photographic plate is developed, giving a complicated pattern that can be considered to be
made up of a diffraction pattern of varying spacing. When the plate is illuminated by the reference
beam alone, it is diffracted by the grating into different angles which depend on the local spacing
of the pattern on the plate. It can be shown that the net effect of this is to reconstruct the object
beam, so that it appears that light is coming from a point source behind the plate, even when the
source has been removed. The light emerging from the photographic plate is identical to the light
that emerged from the point source that used to be there. An observer looking into the plate from
the other side will "see" a point source of light whether the original source of light is there or not.
This sort of hologram is effectively a concave lens, since it "converts" a plane wavefront into a
divergent wavefront. It will also increase the divergence of any wave incident on it.
Complex objects
To record a hologram of a complex object, a laser beam is first split into two separate beams of
light using a beam splitter of half-silvered glass or a birefringent material. One beam illuminates
the object, reflecting its image onto the recording medium as it scatters the beam. The second
(reference) beam illuminates the recording medium directly.
According to diffraction theory, each point in the object acts as a point source of light. Each of
these point sources interferes with the reference beam, giving rise to an interference pattern. The
resulting pattern is the sum of a large number (strictly speaking, an infinite number) of point
source + reference beam interference patterns.
When the object is no longer present, the holographic plate is illuminated by the reference beam.
Each point source diffraction grating will diffract part of the reference beam to reconstruct the
wavefront from its point source. These individual wavefronts add together to reconstruct the
whole of the object beam.
The viewer perceives a wavefront that is identical to the scattered wavefront of the object
illuminated by the reference beam, so that it appears to him or her that the object is still in place.
This image is known as a "virtual" image as it is generated even though the object is no longer
there. The direction of the light source seen illuminating the virtual image is that of the original
illuminating beam.
This explains, albeit in somewhat simple terms, how transmission holograms work. Other
holograms, such as rainbow and Denisyuk holograms, are more complex but have similar
principles.
To reconstruct the object exactly from a transmission hologram, the reference beam must have the
same wavelength and curvature, and must illuminate the hologram at the same angle as the
original reference beam (i.e. only the phase can be changed). Departure from any of these
conditions will give a distorted reconstruction. While nearly all holograms are recorded using
lasers, a narrow-band lamp or even sunlight is enough to recognize the reconstructed image.
Many different holograms may be stored in the same crystal volume by changing the angle of
incidence of the reference beam. One characteristic of the recording medium that limits the
usefulness of holographic storage is the property that every time the crystal is read with the
reference beam, the stored hologram at that “location” is disturbed by the reference beam and
some of the data integrity is lost. With current technology, recorded holograms in Fe- and Tb-
doped LiNbO3 that use UV light to activate the Tb atoms can be preserved without significant
decay for two years.
A series of spectral memory demonstration experiments have been conducted at the University of
Oregon. These experiments employ a 780-nm commercial semiconductor diode laser as the light
source, a crystal of Tm3+:YAG as the frequency selective recording material, and an avalanche
photodiode as a signal detector. The diode laser was stabilized to an external cavity containing
a grating and an electro-optic crystal. The intracavity electro-optic crystal provides for
microsecond-timescale sweeping of the laser frequency over roughly one gigahertz. Two
storage beams (reference and data) and one reading beam are created from the output of the
single laser source using the beam splitter and the acousto-optic modulators shown in the
figure. The beams are focused to a 150-μm² spot in a Tm3+:YAG crystal. The reference and
data beams are simultaneous, as are the read and signal beams.
The SLM is an optical device that is used to convert the real image into a single beam of light that
will intersect with the reference beam during recording. Phase-conjugate holography eliminates
these optical parts by replacing the reference beam that is used to read the hologram with a
conjugate reference beam that propagates in the opposite direction to the beam used for recording.
The signal diffracted by the hologram being accessed is sent back along the path from which it
came, and is refocused onto the SLM, which now serves as both the SLM and the detector.
There are two main classes of materials used for the holographic storage medium. These are
photorefractive crystals and photopolymers (organic films). The most commonly used
photorefractive crystals are LiNbO3 and BaTiO3. During hologram recording, the refractive
index of the crystal is changed by migration of electron charge in response to the imprinted
three-dimensional interference pattern of the reference and signal beams. As more holograms
are superimposed in the crystal, the existing holograms decay further because of interference
from the newly superimposed ones. Also, holograms are degraded every time they are read out,
because
the reference beam used to read out the hologram alters the refractive nature of the crystal in that
region. Photorefractive crystals are suitable for random access memory with periodic refreshing of
data, and can be erased and written to many times. Photopolymers have been developed that can
also be used as a holographic storage medium.
APPLICATION TO BINARY
In order for holographic technology to be applied to computer systems, it must store data in a form
that a computer can recognize. In current computer systems, this form is binary. To achieve
this, the source beam is manipulated; in computer applications, the manipulation encodes bits.
The next section explains the spatial light modulator, a device that converts laser light into
binary data.
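The mapping that a computer-side SLM driver must perform is simple in principle: each byte of the input stream is unpacked into bits, and the bits fill the rows of a rectangular pixel page (a light pixel for 1, a dark pixel for 0). Below is a minimal sketch of this idea; the page width and function name are arbitrary illustrative choices, not details from the report:

```python
def bytes_to_page(data: bytes, width: int = 8) -> list[list[int]]:
    """Unpack a byte stream into a 2-D binary page for an SLM, MSB first."""
    bits = [(byte >> i) & 1 for byte in data for i in range(7, -1, -1)]
    # Pad the final row with dark (0) pixels so the page stays rectangular.
    while len(bits) % width:
        bits.append(0)
    return [bits[i:i + width] for i in range(0, len(bits), width)]

page = bytes_to_page(b"\xA5")      # 0xA5 = binary 10100101
print(page)                        # [[1, 0, 1, 0, 0, 1, 0, 1]]
```

Reading a page back is the inverse operation: the detector array's pixel values are packed row by row into bytes.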
Angular Multiplexing
When a reference beam recreates the source beam, it needs to be at the same angle it was during
recording. A very small alteration in this angle will make the regenerated source beam disappear.
Harnessing this property, angular multiplexing changes the angle of the source beam by very
minuscule amounts after each page of data is recorded. Depending on the sensitivity of the
recording material, thousands of pages of data can be stored in the same hologram, at the same
point of laser beam entry. Unlike conventional data access systems, which move mechanical
parts to reach data, the angle of entry of the source beam can be deflected by high-frequency
sound waves in solids. The elimination of mechanical access methods reduces access times
from milliseconds to microseconds.
Figure above shows a compact module that uses angular multiplexing. The module is composed of
a photorefractive crystal in which holograms are stored, a pair of liquid crystal beam steerers (one
of which is hidden behind the crystal) that is responsible for angularly multiplexing holograms in
the crystal, and an Opto Electronic Integrated Circuit (OEIC) that merges the functions of a
reflective spatial light modulator (SLM) for recording holograms and a detector array for readout.
The reconstructed image is aligned at unit magnification with the photodetectors that sense it,
because of the conjugate nature of the readout process and because the detectors are located
within the same OEIC pixels as the modulators used to record the holograms. Furthermore, the
OEIC provides a
solution to the volatility of holograms stored in a read–write photorefractive memory.
Wavelength Multiplexing
Used mainly in conjunction with other multiplexing methods, wavelength multiplexing alters the
wavelength of source and reference beams between recordings. Sending beams to the same point
of origin in the recording medium at different wavelengths allows multiple pages of data to be
recorded. Due to the small tuning range of lasers, however, this form of multiplexing is limited on
its own.
Spatial Multiplexing
Spatial multiplexing is the method of changing the point of entry of source and reference beams
into the recording medium. This form tends to break away from the non-mechanical paradigm
because either the medium or recording beams must be physically moved. Like wavelength
multiplexing, this is combined with other forms of multiplexing to maximize the amount of data
stored in the holographic volume. Two commonly used forms of spatial multiplexing are
peristrophic multiplexing and shift multiplexing.
Phase-Encoded Multiplexing
The form of multiplexing farthest from mechanical means of recording many pages in the same
volume of a hologram is phase-encoded multiplexing. Rather than manipulating the angle of
entry of a laser beam or rotating/translating the recording medium, phase-encoded multiplexing
changes the phase of individual parts of the reference beam. The main reference beam is split
into many smaller partial beams that cover the same area as the original reference beam. These
smaller beamlets vary in phase, which changes the state of the reference beam as a whole. The
reference beam intersects the source beam and records the diffraction relative to the different
phases of the beamlets. The phase of the beamlets can be changed by nonmechanical
means, therefore speeding up access times.
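Phase-encoded multiplexing works because the phase patterns applied to the beamlets can be chosen to be mutually orthogonal, so reading with one pattern reconstructs only the page stored under that pattern. Walsh-Hadamard codes (entries ±1, i.e. phase shifts of 0 or π) are a classic choice for such patterns. The sketch below only illustrates their orthogonality; it is not an optical simulation, and the function name is an illustrative choice:

```python
def hadamard(n: int) -> list[list[int]]:
    """Sylvester construction of an n x n Hadamard matrix (n a power of two)."""
    h = [[1]]
    while len(h) < n:
        # Double the matrix: [[H, H], [H, -H]].
        h = [row + row for row in h] + [row + [-x for x in row] for row in h]
    return h

codes = hadamard(4)  # four phase codes for four beamlets
# Distinct codes are orthogonal: their inner product is zero, so a readout
# beam carrying one code does not reconstruct pages stored under another.
dot = sum(a * b for a, b in zip(codes[1], codes[2]))
print(dot)  # 0
```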
No single multiplexing method by itself is the best way to pack a hologram full of information.
The true power of multiplexing is brought out in the combination of one or more methods. Hybrid
wavelength and angular multiplexing systems have been tested and the results are promising.
Recent tests have also been performed on spatial multiplexing methods that create a hologram
the size of a compact disc but hold 500 times more data.
ERROR CORRECTION
It is inevitable that storing massive amounts of data in a small volume will be error prone. Factors
exist in both the recording and retrieval of information which will be covered in the following
subsections, respectively. In order for holographic memory systems to be practical in next
generation computer systems, a reliable form of error control needs to be created.
Recording Errors
When data is recorded in holographic medium, certain factors can lead to erroneously recorded
data. One major factor is the electronic noise generated by laser beams. When a laser beam is split
up (for example, through a SLM), the generated light bleeds into places where light was meant to
be blocked out. Areas where zero light is desired might have minuscule amounts of laser light
present, which corrupts its bit representation. For example, if too much light gets recorded into this
zero area representing a binary 0, an erroneous change to a binary 1 might occur. Changes in both
the quality of the laser beam and recording material are being researched, but these improvements
have yet to fully eliminate such errors.
I/O INTERFACE
Like error control, the I/O interface to modern computer systems needs to be tailored to data
retrieval in page format. Bits are no longer read from a stream; they are sent to the computer as
sheets. Clearly the I/O interface needs to be changed to accommodate for this. One of the
problems with such large amounts of data being fed to a processor is that the incoming data may
exceed the processor’s throughput. This is where interfacing needs to bridge the data in a coherent
fashion between memory and processor. In the following subsections, two kinds of interfacing are
covered which vary in a unique way.
Smart Interfacing
Smart interfacing is a method of controlling the way data is sent to the processor from holographic
memory by a pre-defined set of logical commands. These logical commands come from outside
the stored memory and are provided to control the way data is managed before going to the
processor. An example of these pre-defined instructions are the fixed set of rules used by error
detection and correction. Because these rules stay the same throughout memory retrieval, they can
be hard coded into the smart interfacing agent.
Intelligent Interfacing
Seemingly the same as smart interfacing by name, intelligent interfacing is different in one
important way. Intelligent interfacing has external control signals which can be manipulated to
transform incoming data in a non-static manner. These signals create a way for the intelligent
interfacing agent to reduce the incoming data in a meaningful way. For example, a data mining
system could utilize these control signals to ignore certain data which is not a part of the pattern
being searched for. Intelligent interfacing agents can contain the functionality of smart interfaces
such as error control, but have the added feature of dynamically changing the way data passes
through it.
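The distinction between the two interfacing styles can be summarized in a few lines of illustrative code (all names here are hypothetical, not from the report): the smart interface applies one fixed, hard-coded rule to every page, while the intelligent interface accepts an external control signal that may change per query.

```python
# Hypothetical sketch: pages are lists of bits; the rules are placeholders.

def smart_interface(pages, fixed_rule):
    """Apply the same hard-coded rule (e.g. an ECC-style check) to every page."""
    return [p for p in pages if fixed_rule(p)]

def intelligent_interface(pages, control_signal):
    """The control signal is supplied from outside and may change per query,
    e.g. a data-mining filter that drops pages not matching a pattern."""
    return [p for p in pages if control_signal(p)]

pages = [[1, 0, 1, 0], [1, 1, 1, 0], [0, 0, 0, 0]]

# Fixed rule: keep pages with an even number of 1s (stands in for a
# hard-coded error-detection check).
even_parity = lambda p: sum(p) % 2 == 0
kept_smart = smart_interface(pages, even_parity)          # drops [1, 1, 1, 0]

# Dynamic rule for this particular query: keep pages whose first bit is 1.
query_filter = lambda p: p[0] == 1
kept_intelligent = intelligent_interface(pages, query_filter)
```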
The most common holographic recording system uses laser light, a beam splitter to divide the
laser light into reference beam and signal beam, various lenses and mirrors to redirect the light, a
photo reactive crystal, and an array of photo detectors around the crystal to receive the
holographic data. To record a hologram, a beam of laser light is split into two by a beam splitter.
These two beams then become the reference and the signal beams. The signal beam interacts with
an object and the light that is reflected by the object intersects the reference beam at right angles.
The resulting interference pattern contains all the information necessary to recreate the image of
the object after suitable processing. The interference pattern is recorded on to a photo reactive
material and may be retrieved at a later time by using a beam that is identical to the reference
beam. This is possible because the hologram has the property that if it is illuminated by either of
the beams used to record it, the hologram causes light to be diffracted in the direction of the
other beam, reconstructing the original wavefront.
In the memory hierarchy, holographic memory lies somewhere between RAM and magnetic
storage in terms of data transfer rates, storage capacity, and data access times. The theoretical
limit of the number of pixels that can be stored using volume holography is V^(2/3)/λ^2, where V is
the volume of the recording medium and λ is the wavelength of the reference beam.
For green light, the maximum theoretical storage capacity is 0.4 Gbits/cm2 for a page size of 1 cm
x 1 cm. Also, holographic memory has an access time near 2.4 ms, a recording rate of 31 KB/s,
and a readout rate of 10 GB/s. Modern magnetic disks have data transfer rates in the neighborhood
of 5 to 20 MB/s. Typical DRAM today has an access time close to 10 – 40 ns, and a recording rate
of 10 GB/s.
Table 1: Comparison of access time, data transfer rate (readout), and storage capacity (storage
density) for three types of memory, using the figures quoted above (blank where not given).

Memory type         | Access time | Readout rate | Storage density
Holographic memory  | ~2.4 ms     | 10 GB/s      | 0.4 Gbits/cm2
Main Memory (RAM)   | 10 – 40 ns  | 5 MB/s       | 4.0 Mbits/cm2
Magnetic disk       |             | 5 – 20 MB/s  |
Holographic memory has an access time somewhere between main memory and magnetic disk, a
data transfer rate that is an order of magnitude better than both main memory and magnetic disk,
and a storage capacity that is higher than both main memory and magnetic disk. Certainly if the
issues of hologram decay and interference are resolved, then holographic memory could become a
part of the memory hierarchy, or take the place of magnetic disk much as magnetic disk has
displaced magnetic tape for most applications.
Figure shows the most important hardware components in a holographic storage system: the SLM
used to imprint data on the object beam, two lenses for imaging the data onto a matched detector
array, a storage material for recording volume holograms, and a reference beam intersecting the
object beam in the material. What is not shown in Figure is the laser source, beam-forming optics
for collimating the laser beam, beam splitters for dividing the laser beam into two parts, stages for
aligning the SLM and detector array, shutters for blocking the two beams when needed, and wave
plates for controlling polarization.
The optical system shown in Figure, with two lenses separated by the sum of their focal lengths, is
called the “4-f” configuration, since the SLM and detector array turn out to be four focal lengths
apart. Other imaging systems such as the Fresnel configuration (where a single lens satisfies the
imaging condition between SLM and detector array) can also be used, but the 4-f system allows
the high numerical apertures (large ray angles) needed for high density. In addition, since each
lens takes a spatial Fourier transform in two dimensions, the hologram stores the Fourier
transform of the SLM data, which is then Fourier transformed again upon readout by the second
lens. This has several advantages: Point defects on the storage material do not lead to lost bits, but
result in a slight loss in signal-to-noise ratio at all pixels; and the storage material can be removed
and replaced in an offset position, yet the data can still be reconstructed correctly. In addition, the
Fourier transform properties of the 4-f system lead to the parallel optical search capabilities
offered by holographic associative retrieval. The disadvantages of the Fourier transform geometry
come from the uneven distribution of intensity in the shared focal plane of the two lenses, which
we discuss in the axicon section below.
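The claimed robustness to point defects can be checked numerically. This sketch (NumPy, a toy 32 x 32 page; an illustration, not the testers' actual geometry) stores the 2-D Fourier transform of a binary page, zeroes one coefficient to simulate a point defect on the storage material, and transforms back: every pixel shifts slightly, but no bit is lost.

```python
import numpy as np

rng = np.random.default_rng(0)
page = rng.integers(0, 2, size=(32, 32)).astype(float)  # random binary data page

F = np.fft.fft2(page)           # the hologram stores the page's Fourier transform
F[5, 7] = 0.0                   # a point defect wipes out one stored coefficient
readout = np.fft.ifft2(F).real  # the second lens transforms back on readout

# The defect perturbs every pixel a little instead of erasing any one pixel:
# a small uniform SNR loss, and simple thresholding still recovers the page.
recovered = (readout > 0.5).astype(float)
errors = int(np.abs(recovered - page).sum())
print(errors)  # -> 0
```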
PRISM tester
The PRISM tester, built as part of the DARPA Photo Refractive Information Storage Materials
consortium, was designed to allow the rigorous evaluation of a wide variety of holographic
storage materials. This tester was designed for extremely low-baseline BER performance,
flexibility with regard to sample geometry, and high stability for both long recording exposures
and experimental repeatability. The salient features of the PRISM tester are shown in Figure.
The SLM is a chrome-on-glass mask, while the detector array is a low-frame-rate, 16-bit-per-pixel
CCD camera. Custom optics of long focal length (89 mm) provide pixel matching over data pages
as large as one million pixels, or one megapel.

Figure 4:
Primary features of the PRISM holographic materials test apparatus. The SLM is chrome-on-glass,
and the detector array a 1024 x 1024 portion of a large CCD camera. A pair of precision rotation
stages allows the reference beam to enter the storage material under test at any horizontal
incidence angle.

A
pair of precision rotation stages directs the reference
beam, which is originally below the incoming object
beam, to the same horizontal plane as the object
beam. By rotating the outer stage twice as far as the
inner, the reference-beam angle can be chosen from the entire 360-degree angle range, with a
repeatability and accuracy of approximately one microradian. (Note, however, that over two 30-
degree-wide segments within this range, the reference-beam optics occlude some part of the
object-beam path.) The storage material is suspended from a three-legged tower designed for
interferometric stability (better than 0.1 µm) over time periods of many seconds. The secondary
optics occupy approximately 2 feet by 4 feet of optical table space, in addition to the tower and
stages.

Figure 6:
Salient features of the DEMON I holographic digital data storage engine.
A five-element zoom lens demagnifies the SLM to an intermediate image
plane, which is then imaged to the CCD detector with a pair of lenses. The
reference and object beams enter orthogonal faces of a LiNbO3 crystal; a
galvanometrically actuated scanner changes the reference-beam angle over
±10 degrees about the normal.

The laser power available at the apparatus prior to the object/reference beamsplitter was as much
as 400 mW. Simple linear stages move the SLM in two axes and the CCD in three axes for
alignment. The entire system, not including the laser, occupies 18 × 24 inches of optical table
space.
The first experiment performed on the DEMON I tester was the demonstration of multiple
hologram storage at low raw BER (BER without error correction) using modulation codes, which
allow decoding over smaller pixel blocks than the global thresholding described above. Using an
8-mm-thick LiNbO3:Fe crystal storage medium and a strong modulation code (8:12), 1200
holograms were superimposed and read back in rapid succession with extremely low raw BER
(< 2 × 10^-8). In addition, the DEMON I platform has been used to implement both associative
retrieval and phase-conjugate readout, as described below.
Figure 7:
Primary features of the DEMON II holographic digital data storage
engine. Utilizing 30-mm-focal-length Fourier transform lenses in the
90-degree geometry with a one-million-pixel SLM, this system has
demonstrated areal storage densities in excess of 100 bits/µm^2.
The short focal length of the DEMON II optics allows the system to demonstrate high areal
storage densities (the storage capacity of each stack of holograms, divided by the area of the
limiting aperture in the object beam). Since the lenses in the object beam implement a two
dimensional spatial Fourier transform, an aperture placed in the central focal plane of the 4-f
system (just in front of the storage material) can be described as a spatial lowpass filter. The
smaller the volume allocated to each stack of holograms, the larger the capacity of a given large
block of storage material. However, if the aperture is decreased too far, some of the information
from the SLM fails to pass through the aperture. The size of the smallest tolerable aperture
corresponds to the spatial equivalent of the Nyquist sampling condition, in which the spatial
frequency sampling on the SLM (one over its pixel pitch) is twice the maximum spatial frequency
allowed to pass the limiting aperture. Only for apertures equal to or larger than this so-called
“Nyquist” aperture is the information from all pixels of the SLM guaranteed to pass to the detector
array. Since both “positive” and “negative” spatial frequencies are represented in a centered
aperture, the Nyquist aperture turns out to be equal to the inverse of the pixel pitch of the SLM,
scaled by the wavelength and the focal length of the lenses. The design of the imaging optics is
then complicated by this need for short focal length, since the maximum ray angle (and thus the
potential for optical aberrations) is greatly increased. The optical distortion (displacement of pixel
centers from a rectangular grid) in the DEMON II platform is consequently much larger than in
the other two testers, reaching approximately 0.03% (0.3 pixels) in the corners of the received data
page. The development of signal-processing algorithms to compensate for this misregistration
between SLM and CCD pixels is a research topic that we are currently pursuing, with some initial
success.
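Treating the Nyquist-aperture statement as a formula, D = λf/Δ with Δ the SLM pixel pitch: the 30-mm focal length below is taken from the DEMON II description, while the wavelength and pixel pitch are assumed values for illustration only.

```python
# Nyquist aperture: the inverse of the SLM pixel pitch, scaled by the
# wavelength and the focal length of the lenses (as described above).
wavelength = 532e-9   # m, green laser (assumed)
focal_len  = 30e-3    # m, DEMON II Fourier-transform lens focal length
pitch      = 12.8e-6  # m, SLM pixel pitch (assumed)

nyquist_aperture = wavelength * focal_len / pitch  # m, side of the aperture
print(f"{nyquist_aperture * 1e3:.2f} mm")  # -> 1.25 mm
```

Shrinking the aperture below this value starts to clip the highest spatial frequencies of the SLM pattern, which is why smaller stacks trade capacity per unit area against lost pixels.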
Axicon
As previously noted, the Fourier transform process used to focus the object beam into the storage
media has the side effect of producing an undesired high-intensity peak on the optical axis. This
intensity spike can easily saturate the photosensitive response of the storage media, resulting in
severe degradation of both transmitted images and stored holograms. It has been known for many
years that a potential solution to this problem can be implemented by superimposing a random
phase distribution on the pixels of the SLM. In work performed by M.-P. Bernal et al. at IBM
Almaden, it was shown that although such a “random phase mask” does redistribute the intensity
in this spike, the alignment of such a phase mask is critical, and new optical artifacts (dark lines
and interference fringe effects) are introduced in the transmitted image. These artifacts, along with
the difficulty of maintaining the alignment of yet another pixelated component, have made it
improbable that random phase masks will be the solution to the coherent saturation problem.
Figure 8:
A convex axicon deflects the Fourier components of the pixelated
data mask into a toroid. This spreads out the rays, which would
otherwise focus to a single large spike at the optical axis of the
focal plane and distort holographic data pages.
Aspherical apodizer
Typical laser beams have a spatial profile dictated by the oscillation mode of the laser resonator,
with the simplest mode having a Gaussian or bell-shaped profile. The simplest method for
generating a beam with a uniform (or flat) spatial profile is to simply expand a Gaussian beam and
use only the center portion. The power efficiency then trades off directly with the desired flatness
of illumination: If an illumination flatness of 5% is required over a certain area, only 5% of the
incident beam power can actually be used. It has long been desirable in laser physics to be able to
efficiently generate a laser beam with a uniform cross section. Although many ingenious solutions
have been proposed, the few that have been implemented generally work only out to the 1/e
field points of the original Gaussian beam, and commonly suffer from poor flatness, severe
diffraction effects, and distortion of the wavefront quality of the apodized beam. In addition, many
solutions, including diffractive optics, create a beam which attains uniform intensity in one plane
in space, but then diverges and distorts away from that plane.
As part of the design of DEMON II, the creation of “flat-top” beams was studied. This was
germane not only to DEMON II, but also to ongoing work in deep-UV lithography. A new insight
was obtained after a review of historical efforts in this field. A two-element telescope with
transmissive optical elements was designed that produces a highly efficient flat-top laser beam
with the capability of propagating for several meters with little distortion and diffraction-limited
wavefront quality. The Gaussian-beam-to-flat-top converter utilizes a convex aspheric lens to
introduce aberrations into the beam, redistributing the laser power from a particular incident
Gaussian profile to the desired flat-top profile with a rapid-intensity roll-off at the edge. A second
aspheric optic recollimates the aberrated beam, restoring the wavefront quality and allowing it to
propagate for long distances without spreading. As a result, the central 60% of the output power
will be uniform in intensity to 2%, and 99.7% of the incident laser beam power is used in the
output apodized beam. The roll-off of the intensity profile was carefully crafted to minimize
diffraction effects from the edge of the beam during propagation. Although the input and output
beam dimensions are fixed for a given apodizer, it was discovered that a single apodizer could be
used from the deep UV into the far IR with only a simple focus adjustment.
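The flatness-versus-power trade-off quoted earlier (5% flatness leaves only 5% of the power usable) can be verified analytically: for a Gaussian profile, the power enclosed in the region where the intensity stays within a fraction f of the peak is exactly f. A short check:

```python
import math

def usable_power_fraction(flatness):
    """Fraction of a Gaussian beam's power inside the region where the
    intensity stays within `flatness` of the peak.

    For I(r) = I0 * exp(-2 r^2 / w^2), the enclosed power inside radius R
    is 1 - exp(-2 R^2 / w^2), which equals 1 - I(R)/I0 = flatness."""
    # Radius (squared, in units of w^2) where intensity falls to (1 - flatness):
    r2_over_w2 = -math.log(1.0 - flatness) / 2.0
    return 1.0 - math.exp(-2.0 * r2_over_w2)

# 5% flatness -> only 5% of the incident power is usable, as stated above.
print(round(usable_power_fraction(0.05), 6))  # -> 0.05
```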
Phase-conjugate readout
As described in the previous sections on tester platforms, the need for both high density and
excellent imaging requires an expensive short-focal-length lens system corrected for all
aberrations (especially distortion) over a large field, as well as a storage material of high optical
quality. Several authors have proposed bypassing these requirements by using phase-conjugate
readout of the volume holograms. After the object beam is recorded from the SLM with a
reference beam, the hologram is reconstructed with a phase-conjugate (time reversed copy) of the
original reference beam. The diffracted wavefront then retraces the path of the incoming object
beam in reverse, canceling out any accumulated phase errors. This should allow data pages to be
retrieved with high fidelity with a low-performance lens, from storage materials fabricated as
multimode fibers, or even without imaging lenses for an extremely compact system.
Most researchers have relied on the visual quality of retrieved images or detection of isolated fine
structure in resolution targets as proof that phase-conjugate retrieval provides high image fidelity.
This, however, is no guarantee that the retrieved data pages will be correctly received by the
detector array. In fact, the BER of pixel-matched holograms can be used as an extremely sensitive
measure of the conjugation fidelity of volume holograms. Any errors in rotation, focus, x-y
registration, magnification, or residual aberrations will rapidly increase the measured bit-error rate
(BER) for the data page. Using the pixel-matched optics in both the DEMON I platform and the
PRISM tester, we have implemented low-BER phase-conjugate readout of large data pages. On
the PRISM tester, phase conjugation allowed the readout of megapel pages through much smaller
apertures than in the original megapel experiment mentioned above, which was performed without
phase conjugation. This demonstrates a thirtyfold increase in areal density per hologram.

Figure 9:
A pair of optical elements with aspheric surfaces distributes the power from an input beam with a
Gaussian profile, resulting in an output beam of uniform intensity within a given region.

A first lens imaged the megapel mask onto a mirror placed halfway between the mask and CCD.
After deflection by
this mirror, the object beam was collected by a second lens, forming an image of the mask. Here
an Fe-doped LiNbO3 crystal was placed to store a hologram in the 90-degree geometry. After
passing through the crystal, the polarization of the reference beam was rotated and the beam was
focused into a self-pumped phase-conjugate mirror using a properly oriented, nominally undoped
BaTiO3 crystal. In such a configuration, the input beam is directed through the BaTiO3 crystal and
into the far corner, creating random backscattering throughout the crystal. It turns out that counter-
propagating beams (one scattered upon input to the crystal, one reflected from the back face) are
preferentially amplified by the recording of real-time holograms, creating the two “pump” waves
for a four-wave-mixing process. Since momentum (or wave vector) must be conserved among
four beams (energy is already conserved because all four wavelengths are identical), and since the
two “pump” beams are already counter-propagating, the output beam generated by this process
must be the phase-conjugate to the input beam.
The crystal axes of the LiNbO3 were oriented such that the return beam from the phase-conjugate
mirror wrote the hologram, and the strong incoming reference beam was used for subsequent
readout. (Although both mutually phase-conjugate reference beams were present in the LiNbO3
during recording, only the beam returning from the phase-conjugate mirror wrote a hologram
because of the orientation of the LiNbO3 crystal axes. For readout, the phase-conjugate mirror was
blocked, and the incoming reference beam read this hologram, reconstructing a phase-conjugate
object beam.) By turning the mirror by 90 degrees, this phase-conjugate object beam was
deflected to strike the pixel-matched CCD camera. We were able to store and retrieve a megapel
hologram with only 477 errors (BER = 5 × 10^-4) after applying a single global threshold. The
experiment was repeated with a square aperture of 2.4 mm on a side placed in the object beam at
the LiNbO3 crystal, resulting in 670 errors. Even with the large spacing between SLM and CCD,
this is already an areal density of 0.18 bits per µm^2 per hologram. In contrast, without phase-
conjugate readout, an aperture of 14 mm × 14 mm was needed to produce low BERs with the
custom optics. The use of phase conjugate readout allowed mapping of SLM pixels to detector
pixels over data pages of 1024 pixels x 1024 pixels without the custom imaging optics, and
provided an improvement in areal density (as measured at the entrance aperture of the storage
material) of more than 30.
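The quoted areal density is easy to cross-check from the numbers above (a 1024 x 1024 page read through a 2.4-mm-square aperture):

```python
# Cross-check of the areal density and density-gain figures quoted above.
pixels = 1024 * 1024            # bits per hologram (one megapel page)
aperture_um = 2.4e3             # 2.4 mm aperture side, in micrometres

density = pixels / aperture_um**2    # bits per um^2 per hologram
print(round(density, 2))  # -> 0.18

# Aperture-area ratio vs. the 14-mm aperture needed without phase conjugation:
gain = (14 / 2.4) ** 2               # ~34x, consistent with "more than 30"
```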
Having discussed the optical components that imprint and detect information, we move to a
discussion of coding and signal processing, and the best possible use of these components to
record and retrieve digital data from a holographic data storage system.

Figure: Portions of data pages holographically reconstructed through a phase aberration (a) without
phase-conjugate readout (BER = 5 × 10^-2); (b) with phase-conjugate readout (BER < 10^-5), thus
canceling out accumulated phase errors.
In a data-storage system, the goal of coding and signal processing is to reduce the BER to a
sufficiently low level while achieving such important figures of merit as high density and high
data rate. This is accomplished by stressing the physical components of the system well beyond
the point at which the channel is error-free, and then introducing coding and signal processing
schemes to reduce the BER to levels acceptable to users. Although the system retrieves raw data
from the storage device with many errors (a high raw BER), the coding and signal processing
ensures that the user data are delivered with an acceptably low level of error (a low user BER).
Coding and signal processing can involve several qualitatively distinct elements. The cycle of user
data from input to output can include interleaving, error correction- code (ECC) and modulation
encoding, signal preprocessing, data storage in the holographic system, hologram retrieval, signal
post-processing, binary detection, and decoding of the interleaved ECC.
The ECC encoder adds redundancy to the data in order to provide protection from various noise
sources. The ECC-encoded data are then passed on to a modulation encoder which adapts the data
to the channel: It manipulates the data into a form less likely to be corrupted by channel errors and
more easily detected at the channel output. The modulated data are then input to the SLM and
stored in the recording medium. On the retrieving side, the CCD returns pseudo-analog data
values (typically camera count values of eight bits) which must be transformed back into digital
data (typically one bit per pixel). The first step in this process is a post-processing step, called
equalization, which attempts to undo distortions created in the recording process, still in the
pseudo-analog domain. Then the array of pseudo-analog values is converted to an array of binary
digital data via a detection scheme. The array of digital data is then passed first to the modulation
decoder, which performs the inverse operation to modulation encoding, and then to the ECC
decoder. In the next subsections, we discuss several sources of noise and distortion and indicate
how the various coding and signal-processing elements can help in dealing with these problems.
Within a sufficiently small region of the detector array, there is not much variation in pixel
intensity. If the page is divided into several such small regions, and within each region the data
patterns are balanced (i.e., have an equal number of 0s and 1s), detection can be accomplished
without using a threshold. For instance, in sorting detection, letting N denote the number of pixels
in a region, one declares the N/ 2 pixels with highest intensity to be 1s and those remaining to be
0s. This balanced condition can be guaranteed by a modulation code which encodes arbitrary data
patterns into codewords represented as balanced arrays. Thus, sorting detection combined with
balanced modulation coding provides a means to obviate the inaccuracies inherent in threshold
detection. The price that is paid here is that in order to satisfy the coding constraint (forcing the
number of 0s and 1s to be equal), each block of N pixels now represents only M bits of data. Since
M is typically less than N, the increase in stored pages enabled by the code must exceed the
inverse of the code rate, r = M/N, for a net gain in capacity. For example, for N = 8, there are 70
ways to combine eight pixels such that exactly
four are 1 and four are 0. Consequently, we can store six bits of data (64 different bit sequences)
for a code rate of 75%. The code must then produce a >33% increase in the number of holographic
pages stored, in order to increase the total capacity of the system in bits.
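A minimal sketch of this balanced 6:8 scheme with sorting detection (the particular 64-codeword subset of the 70 balanced patterns is an arbitrary choice here; the report does not specify one):

```python
from itertools import combinations

# Balanced 6:8 modulation code: each codeword of N = 8 "pixels" has exactly
# four 1s; 64 of the 70 balanced patterns encode 6 data bits (rate 75%).
N = 8
balanced = sorted(
    tuple(1 if i in ones else 0 for i in range(N))
    for ones in combinations(range(N), N // 2)
)                                             # all 70 balanced patterns
codebook = balanced[:64]                      # 6 data bits -> one pattern
decode_map = {cw: idx for idx, cw in enumerate(codebook)}

def sorting_detect(intensities):
    """Declare the N/2 brightest pixels to be 1s -- no threshold needed."""
    bright = sorted(range(N), key=lambda i: intensities[i], reverse=True)[: N // 2]
    return tuple(1 if i in bright else 0 for i in range(N))

# Encode the value 37, give the 1-pixels unequal "illumination", and detect:
cw = codebook[37]
intensities = [0.9 + 0.1 * i if b else 0.15 for i, b in enumerate(cw)]
detected = sorting_detect(intensities)
assert detected == cw and decode_map[detected] == 37
```

Because detection only ranks intensities within the block, page-to-page brightness variations cancel out, which is exactly the inaccuracy of global thresholding that this scheme avoids.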
One problem with this scheme is that the array detected by sorting may not be a valid codeword
for the modulation code; in this case, one must have a procedure which transforms balanced arrays
into valid code words. This is not much of a problem when most balanced arrays of size N are
code words, but for other codes this process can introduce serious errors. A more complex but
more accurate scheme than sorting is correlation detection, as proposed in. In this scheme, the
detector chooses the codeword that achieves maximum correlation with the array of received pixel
intensities. In the context of the 6:8 code described above, 64 correlations are computed for each
code block, avoiding the six combinations of four 1 and four 0 pixels that are not used by the code
but which might be chosen by a sorting algorithm.
Interpixel interference
Interpixel interference is the phenomenon in which intensity at one particular pixel contaminates
data at nearby pixels. Physically, this arises from optical diffraction or aberrations in the imaging
system. The extent of interpixel interference can be quantified by the point-spread function,
sometimes called a PSF filter. If the channel is linear and the PSF filter is known, the interpixel
interference can be represented as a convolution with the original (encoded) data pattern and then
“undone” in the equalization step via a filter inverse to the PSF filter (appropriately called
deconvolution).
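Under the linear-channel assumption, the deconvolution step can be sketched with a circular convolution and its Fourier-domain inverse (toy PSF; as the text notes, real channels are nonlinear and noisy, so this is the idealized case):

```python
import numpy as np

rng = np.random.default_rng(1)
page = rng.integers(0, 2, (16, 16)).astype(float)   # encoded data page

# Toy point-spread function: light bleeds into the right and lower neighbours.
psf = np.zeros((16, 16))
psf[0, 0] = 1.0                  # main tap
psf[0, 1] = psf[1, 0] = 0.2      # interpixel interference

H = np.fft.fft2(psf)
# Channel: interpixel interference as a (circular) convolution with the PSF.
blurred = np.fft.ifft2(np.fft.fft2(page) * H).real
# Equalization: divide by the known PSF in the Fourier domain (deconvolution).
equalized = np.fft.ifft2(np.fft.fft2(blurred) / H).real

print(np.allclose(equalized, page))  # -> True
```

In practice the division also amplifies any noise at frequencies where H is small, which is the noise-enhancement drawback mentioned in the next paragraph of the original papers; here the channel is noiseless, so recovery is exact.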
Deconvolution has the advantage that it incurs no capacity overhead (code rate of 100%).
However, it suffers from mismatch in the channel model (the physics of the intensity detection
makes the channel nonlinear), inaccuracies in estimation of the PSF, and enhancement
of random noise. An alternative approach to combating interpixel interference is to forbid certain
patterns of high spatial frequency via a modulation code. According to the model in, for certain
realistic and relatively optimal choices of system parameters (in particular at the Nyquist aperture
described above), if one forbids a 1 surrounded by four 0s (in its four neighbors on the cardinal
points of the compass), areal density can be improved, provided that the modulation code has a
sufficiently high rate.
A code that forbids a pattern of high spatial frequency (or, more generally, a collection of such
patterns of rapidly varying 0 and 1 pixels) is called a low-pass code. Such codes constrain the
allowed pages to have limited high spatial frequency content. A general scheme for designing
such codes is given in, via a strip encoding method in which each data page is encoded, from top
to bottom, in narrow horizontal pixel strips. The constraint is satisfied both along the strip and
between neighboring strips. Codes that simultaneously satisfy both a constant-weight constraint
and a low-pass constraint are given in.
Interleaving
Assume for simplicity that our choice of ECC can correct at most two byte errors per codeword. If
the codewords are interleaved so that any cluster error can contaminate at most two bytes in each
codeword, the cluster error will not defeat the error-correcting power of the code. Interleaving
schemes such as this have been studied extensively for one-dimensional applications (for which
cluster errors are known as burst errors). However, relatively little work has been done on
interleaving schemes for multidimensional applications such as holographic recording. One recent
exception is a class of sophisticated interleaving schemes for correcting multidimensional cluster
errors developed in.
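A toy version of such an interleaving scheme (the geometry is chosen for illustration only): symbols are assigned to K codewords round-robin across the page, and we verify that no 2 x 2 cluster error ever hits more than two symbols of any one codeword, which stays within the assumed ECC's two-error budget.

```python
# Toy 2-D interleaver: an 8 x 8 page of symbols spread over K = 4 codewords.
W, H, K = 8, 8, 4

def codeword_of(r, c):
    """Round-robin assignment of page symbols to codewords."""
    return (r * W + c) % K

def worst_cluster_damage():
    """Max symbols of any single codeword hit by any 2 x 2 cluster error."""
    worst = 0
    for r in range(H - 1):
        for c in range(W - 1):
            hits = [codeword_of(r + dr, c + dc) for dr in (0, 1) for dc in (0, 1)]
            worst = max(worst, max(hits.count(k) for k in set(hits)))
    return worst

print(worst_cluster_damage())  # -> 2: within a two-error-correcting ECC's budget
```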
For certain sources of error, it is reasonable to assume that the raw-BER distribution is fixed from
hologram to hologram. Thus, the raw-BER distribution across the page can be accurately
estimated from test patterns. Using this information, codewords can then be interleaved in such a
way that not too many pixels with high raw BER can lie in the same codeword (thereby lowering
the probability of decoder failure or miscorrection). This technique, known as matched
interleaving, can yield a significant improvement in user BER.
Gray scale
The previous sections have shown that the coding introduced to maintain acceptable BER comes
with an unavoidable overhead cost, resulting in somewhat less than one bit per pixel. The
predistortion technique described in the previous section makes it possible to record data pages
containing gray scale, since we record and detect more than two brightness levels per pixel.

Figure: Intensity histogram for high-areal-density holograms (a) before and (b) after applying the
predistortion technique. Before, interpixel crosstalk broadens the brightness distribution; after,
these deterministic variations are reduced, improving the BER of the system.
If pixels take one of g brightness levels, each pixel can convey log2 g bits of data. The total
amount of stored information per page has increased, so gray-scale encoding appears to produce a
straightforward improvement in both capacity and readout rate. However, gray scale also divides
the system’s signal-to-noise ratio (SNR) into g - 1 parts, one for each transition between
brightness levels. Because total SNR depends on the number of holograms, dividing the SNR for
gray scale (while requiring the same error rate) leads to a reduction in the number of holograms
that can be stored. The gain in bits per pixel must then outweigh this reduction in stored
holograms to increase the total capacity in bits.
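That break-even condition can be written down directly: with g levels each pixel carries log2 g bits, so gray scale increases total capacity only if the fraction of holograms still storable after the SNR is split exceeds 1/log2 g. A sketch (the retention fraction is an assumed input, not derived here):

```python
import math

def capacity_gain(g, hologram_retention):
    """Relative capacity vs. binary pages: log2(g) bits per pixel times the
    fraction of holograms still storable after the SNR is split g - 1 ways.
    `hologram_retention` is an assumed, measured quantity, not derived."""
    return math.log2(g) * hologram_retention

# g = 4 levels double the bits per pixel if no holograms are lost ...
assert capacity_gain(4, 1.0) == 2.0
# ... but if SNR splitting lets us store only 40% as many holograms,
# the net capacity falls below binary recording:
assert capacity_gain(4, 0.4) < 1.0
```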
Each search, initiated by a user query, ran under computer control, including display of the
appropriate patterns, detection of the correlation peaks (averaging eight successive measurements
to reduce detector noise), calibration by hologram strength, identification of the eight highest
correlation scores, mapping of correlation bins to reference-beam angle, address-based recall of
these eight holograms, decoding of the pixel-matched data pages, and, finally, display of the
binary images on the computer monitor. The optical readout portion occupied only 0.25 s of the
total 5-s cycle time. To find images based on color similarity, the 64 sliders were used to input the
color histogram information for the upper left image in Figure 17(a). The slider patterns for this
color histogram were input to the system on the SLM, resulting in 100 reconstructed reference
beams. After detection, calibration, and ranking of these 100 correlation peaks, the reference
beams for the brightest eight were input to the system again, resulting in eight detected data pages
and thus eight decoded binary images. Figure 17(a) shows the first four of these images, indicating
that the holographic search process found these images to be those which most closely matched
the color histogram query. Figure 17(b) quantifies the search fidelity by plotting the detected
correlation peak intensity as a function of the overlap between the object-beam search patterns.
Perfect system performance would result in a smooth monotonic curve; however, noise in the real
system introduces deviations away from this curve. As expected, the feature vector for the left-
hand image correlated strongly with itself, but the system was also able to correctly identify the
images with the highest cross-correlation.
These sliders could also be used to select images by color distribution. Figures 17(c) and 17(d)
correspond to a search for images containing 20% white and 20% light gray. Although several
images were ranked slightly higher than they deserved (red circle), the system performance was
impressive, considering that the background “dark” signal was twice as large as the signal. In
Figures 17(e) and 17(f), the alphanumeric description field was used to search for the keyword
shore. Note that because many characters are involved, both the expected and measured scores are
large. However, we obtained similar results for exact search arguments as small as a single
character.
With the fuzzy coding techniques we have introduced, volume holographic content-addressable
data storage is an attractive method for rapidly searching vast databases with complex queries.
Areas of current investigation include implementing system architectures which support many
thousands of simultaneously searched records, and quantifying the capacity– reliability tradeoffs.
Thus far, we have discussed the effects of the hardware, and of coding and signal processing, on
the performance of holographic data storage systems. Desirable parameters described so far
include storage capacity, data input and output rates, stability of stored data, and device
compactness, all of which must be delivered at a specified (very low) user BER. To a large extent,
the possibility of delivering such a system is limited by the properties of the materials available as
storage media. The connections between materials properties and system performance are
complex, and many tradeoffs are possible in adapting a given material to yield the best results.
Here we attempt to outline in a general way the desirable properties for a holographic storage
medium and give examples of some promising materials.
Properties of foremost importance for holographic storage media can be broadly characterized as
“optical quality,” “recording properties,” and “stability.” These directly affect the data density and
capacity that can be achieved, the data rates for input and output, and the BER.
As mentioned above, for highest density at low BER, the imaging of the input data from the SLM
to the detector must be nearly perfect, so that each data pixel is read cleanly by the detector. The
recording medium itself is part of the imaging system and must exhibit the same high degree of
perfection. Furthermore, if the medium is moved to access different areas with the readout beam,
this motion must not compromise the imaging performance. Thus, very high standards of optical
homogeneity and fabrication must be maintained over the full area of the storage medium. With
sufficient materials development effort and care in fabrication, the necessary optical quality has
been achieved for both inorganic photorefractive crystals and organic photopolymer media. As
discussed above, phase-conjugate readout could ultimately relax these requirements.
A more microscopic aspect of optical quality is intrinsic light scattering of the material. The
detector noise floor produced by scattering of the readout beam imposes a fundamental minimum
on the efficiency of a stored data hologram, and thus on the storage density and rate of data
readout [38]. Measurements on the PRISM tester have shown that, in general, the best organic
media have a higher scattering level than inorganic crystals, by about a factor of 100 or more.
Because holography is a volume storage method, the capacity of a holographic storage system
tends to increase as the thickness of the medium increases, since greater thickness implies the
ability to store more independent diffraction gratings with higher selectivity in reading out
individual data pages without crosstalk from other pages stored in the same volume. For the
storage densities necessary to make holography a competitive storage technology, a media
thickness of at least a few millimeters is highly desirable. In some cases, particularly for organic
materials, it has proven difficult to maintain the necessary optical quality while scaling up the
thickness, while in other cases thickness is limited by the physics and chemistry of the recording
process.
Holographic recording properties are characterized in terms of sensitivity and dynamic range.
Sensitivity refers to the extent of refractive index modulation produced per unit exposure (energy
per unit area). Diffraction efficiency (and thus the readout signal) is proportional to the square of
the index modulation times the thickness. Thus, recording sensitivity is commonly expressed in
terms of the square root of the diffraction efficiency η:
S = (1/(I ℓ)) ∂√η/∂t (2)
where I is the total recording intensity, t the exposure time, and ℓ the medium thickness.
The term dynamic range refers to the total response of the medium when it is divided up among
many holograms multiplexed in a common volume of material; it is often parameterized as a
quantity known as M# (pronounced “M-number” [39]), where
M# = Σ_i √η_i (3)
and the sum is over the M holograms in one location. The M# also describes the scaling of
diffraction efficiency as M is increased, i.e.,
η = (M#/M)^2 (4)
Dynamic range has a strong impact on the data storage density that can be achieved. For example,
to reach a density of 100 bits/μm² (64 Gb/in.²) with megapixel data pages, a target diffraction
efficiency of 3 × 10⁻⁵, and an area at the medium of 0.1 cm² would require M# = 5, a value that is
barely achievable with known recording materials under exposure conditions appropriate for
recording high-fidelity data holograms.
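The M# = 5 figure quoted above can be checked from Equation (4) and the numbers in the text. Megapixel pages at roughly one bit per pixel is an assumption consistent with the earlier coding discussion.

```python
import math

density_bits_per_um2 = 100      # target areal density (64 Gb/in.^2)
area_um2 = 0.1 * 1e8            # 0.1 cm^2 of medium, in square microns
page_bits = 1e6                 # megapixel data page, ~1 bit per pixel
eta_target = 3e-5               # target diffraction efficiency per hologram

# Holograms that must be multiplexed in this area to hit the density target
M = density_bits_per_um2 * area_um2 / page_bits

# Equation (4): eta = (M# / M)^2  =>  M# = M * sqrt(eta)
m_number = M * math.sqrt(eta_target)

print(int(M), round(m_number, 1))   # 1000 holograms, M# of about 5.5
```

About a thousand holograms must share the volume, and the required M# lands within rounding of the value given in the text.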
The diffusion-driven photopolymer systems offer very high sensitivity and need no post-exposure
processing. The basic mechanism is a photosensitized polymerization, coupled with
diffusion of monomer and other components of the material formulation under influence of the
resulting concentration gradients. The medium is usually partially prepolymerized to produce a
gel-like matrix, allowing rapid diffusion at room temperature. Refractive index modulation and
recording of holograms result from both the density change and the difference in polarizability of
the polymerized material. The magnitude of this refractive index modulation can be very high,
resulting in a high dynamic range. For simple plane-wave holograms, an M# as high as 42 has
been observed. For digital data holograms, the contrast of the interference pattern between object
and reference beams is lower than in the plane-wave case and the recording conditions do not
produce as large an index modulation. Even so, the M# observed for digital holograms on the
PRISM materials tester is around 1.5, one of the highest yet observed; this value can undoubtedly
be improved by optimization of the recording conditions.
The recording mechanism for photopolymers also leads to some disadvantages, including the
shrinkage of the material with polymerization and the possibility of nonlinear response. Both of
these distort the reconstructed holograms and thus cause errors in decoding the digital data. For
some photopolymers, significant advances have been made toward eliminating these undesired
properties; for example, shrinkage has been reduced to less than 0.1% while sufficient useful
dynamic range for recording of data has been retained. There are additional problems in increasing
the thickness of these materials to the millimeter scale that is desirable for holography, and even
then the Bragg angle selectivity is not sufficient to allow enough holograms to be written in a
common volume to achieve high data density. However, through the use of nonselective
multiplexing methods, it is possible to increase the density to a competitive level. One of these
methods, known as peristrophic multiplexing, involves the rotation of the medium about an axis
normal to its plane such that the reconstructed hologram image rotates away from the detector,
allowing another hologram to be written and read. We have recently demonstrated the recording
and readout with very low error rate of 70 holograms of 256 Kb each on the PRISM tester, using a
combination of Bragg angle and peristrophic multiplexing.
Photopolymer materials have undergone rapid development and show great potential as write-
once holographic media; the best of them are promising as storage media for WORM data storage. The
photorefractive crystals have traditionally been the favorite candidates for reversible, rewritable
storage; recent work on two-color recording has shown the way to a possible solution of the
volatility of reversible media during readout. The following section describes this concept.
Two main schemes for providing nondestructive readout have been proposed, both in lithium
niobate, although the concepts are applicable to a broader range of materials. The first was thermal
fixing, in which a copy of the stored index gratings is made by thermally activating proton
diffusion, creating an optically stable complementary proton grating. Because of the long times
required for thermal fixing and the need to fix large blocks of data at a time, thermally fixed media
somewhat resemble reusable WORM materials. Another class of fixing process uses two
wavelengths of light. One approach uses two different wavelengths of light for recording and
reading, but for storage applications this suffers from increased crosstalk and restrictions on the
spatial frequencies that can be recorded. The most promising two-color scheme is “photon-gated”
recording in photorefractive materials, in which charge generation occurs via a two-step process.
Coherent object and reference beams at a wavelength λ1 record information in the presence of
gating light at a wavelength λ2. The gating light can be incoherent or broadband, such as a white-
light source or LED. Reading is done at λ1 in the absence of gating light. Depending on the
specific implementation, either the gating light acts to sensitize the material, in which case it is
desirable for the sensitivity to decay after the writing cycle, or the gating light ionizes centers in
which a temporary grating can be written at the wavelength λ1. Figure 18 shows a schematic of
energy levels comparing the two-color and one-color schemes for a photorefractive material with
localized centers in the bandgap. A very important and unique figure of merit for photon-gated
holography is the gating ratio, the ratio between the sensitivity of the material in the presence and
absence of gating light.
Reduced stoichiometric lithium niobate shows both one-color sensitivity in the blue-green spectral
region and two-color sensitivity for writing in the near IR and gating with blue-green light. It
follows that the gating light also produces erasure, a consequence of the broad spectral features
of reduced or Fe-doped lithium niobate. Considerable progress is envisaged if a
better separation of gating and erasing functions can be achieved by storing information in deeper
traps and/or using wider-bandgap materials. Figure 19 compares one-color and two-color writing
in a sample of reduced, near-stoichiometric lithium niobate to illustrate the nondestructive readout
that can be achieved. The gating ratio in this case was in excess of 5000.
Growth at the congruent melting composition of lithium niobate promotes high optical quality and
large boules. Crystals of nominally undoped lithium niobate, grown with a stoichiometry of 49.7%
(SLN) by a special double-crucible technique, were compared with those of the congruent
composition (CLN). Strong differences were observed.
As we have seen, the most important photorefractive properties for two-color holographic data
storage are the gating ratio (measuring the degree of nonvolatility), sensitivity, M# or dynamic
range, dark decay, and optical quality. Table 2 shows most of these properties for stoichiometric
and congruent compositions compared to the behavior of conventional one-color Fe-doped lithium
niobate. Photorefractive sensitivity for two-color recording in lithium niobate is linear in the
gating light intensity, Ig, only at low values of Ig because of competition between gating and
erasing. Hence, the sensitivity in terms of incident intensities, S_η2, is defined similarly to that for
one-color processes [see Equation (2)], but for a fixed and reasonably low value of Ig = 1 W/cm².
Summarizing the results of Table 2, the sensitivity gains for two-color recording in reduced,
nearly stoichiometric lithium niobate with respect to the congruent material are 15× for increased
stoichiometry and 20× for degree of reduction. In addition, lowering the gating wavelength from
520 nm to 400 nm gains a further factor of 10, and cooling from 20°C to 0°C a factor of 5.
There is an interesting difference in the behavior of one- and two-color materials with regard to
dynamic range. In a one-color material, the M# is proportional to the modulation index or fringe
visibility of the optical interference pattern, m = 2√(I1·I2)/(I1 + I2). However, in a two-color
material, the writing light (I1 + I2) does not erase the hologram, and the M# is proportional to
√(I1·I2). As a result, for object and reference beams of equal intensity, the M# is proportional to
the writing intensity. While this provides a general way of increasing the dynamic range in a two-
color material, the writing power requirements in the present material system become rather high
in order to achieve a substantial increase in M#.
Instead of amplifying the role of the intrinsic shallow levels with stoichiometry, an alternative
scheme for implementing two-color holography in lithium niobate is the introduction of two
impurity dopants. One trap, such as Mn, serves as the deep trap from which gating occurs, while a
more shallow trap, such as Fe, provides the more shallow intermediate level for gated recording.
While this scheme provides more opportunities for tuning through choice of dopants, in general it
is difficult in LiNbO3 to separate the two absorption bands enough to provide high gating ratios
and thus truly nonvolatile storage. In addition, while M# improves monotonically with writing
intensity for stoichiometric lithium niobate, with the two-trap method M# is maximized at a
particular writing intensity, thus creating an undesirable tradeoff between recording rate and
dynamic range.
Holographic data storage, however, currently suffers from the relatively high component
and integration costs faced by any emerging technology. In contrast, magnetic hard drives, also
known as direct access storage devices (DASD), are well established, with a broad knowledge
base, infrastructure, and market acceptance. Are there any scenarios conceivable for holographic
data storage, where its unique combination of technical characteristics could come to bear and
overcome the thresholds faced by any new storage technology?
Four conceivable product scenarios are shown in Figure 20. The first two scenarios use read/write
media, while the latter two are designed for WORM materials, which are much easier to develop
but must support data retention times as long as tens of years. The first scenario [Figure 20(a)]
takes advantage of rapid optical access to a stationary block of media, resulting in a random-
access time of the order of 10 ms. The capacity is limited to about 25 GB by the size of the block
of media that can be addressed by simple, inexpensive optics. Such a device could bridge the gap
between conventional semiconductor memory and DASD, providing a nonvolatile holographic
cache with an access time that is between DASD and dynamic random-access memory (DRAM).
Using the same optical components but replacing the stationary block of media with a rotating
disk results in performance characteristics similar to those of a disk drive, albeit with terabytes
(10^12 bytes) of capacity per platter [Figure 20(b)]. In the CD-ROM type of embodiment [Figure
20(c)], holographic data storage takes advantage of the fact that single-exposure full-disk
replication has been demonstrated. The player for the holographic ROM is conceptually very
simple: The photodiode from a conventional ROM player is replaced by a CMOS camera chip,
and the reconstructed data page is then imaged with suitable optics onto that camera. Combining
one of the DASD-type R/W heads and possibly a number of CD-ROM-type readers, a robotic
picker, and sufficient tiles of media, a data warehouse with petabyte (10^15 bytes) capacity in a
standard 19-inch rack is conceivable [Figure 20(d)]. While the access time to any of the stored
files is determined by the robotic picker and will be of the order of tens of seconds, the aggregate
sustained data rate could be enormous. In this scenario, the relatively high component cost of a
read/write holographic engine is amortized over a large volume of cheap media to obtain
competitive cost per gigabyte. Will one of these scenarios, with data stored in holograms,
materialize in the foreseeable future? In collaboration and competition with
a large number of scientists from around the globe, we continue to study the technical feasibility
of holographic storage and memory devices with parameters that are relevant for real-world
applications. Whether this research will one day lead to products depends on the insights that we
gain into these technical issues and how well holography can compete with established techniques.
BRIEF HISTORY
Although holography was conceived in the late 1940s, it was not considered a potential storage
technology until the development of the laser in the 1960s. The resulting rapid development of
holography for displaying 3-D images led researchers to realize that holograms could also store
data at a volumetric density of as much as 1/λ³, where λ is the wavelength of the light beam used.
Since each data page is retrieved by an array of photodetectors, rather than bit-by-bit, the
holographic scheme promises fast readout rates as well as high density. If a thousand holograms,
each containing a million pixels, could be retrieved every second, for instance, then the output
data rate would reach 1 Gigabit per second.
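The 1 gigabit per second figure in the example above is direct arithmetic:

```python
pages_per_second = 1000        # holograms retrieved per second
pixels_per_page = 1_000_000    # a million pixels per page
bits_per_pixel = 1             # binary data pages

rate = pages_per_second * pixels_per_page * bits_per_pixel
print(rate / 1e9, "Gb/s")      # 1.0 Gb/s
```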
In the early 1990s, interest in volume-holographic data storage was rekindled by the availability of
devices that could display and detect 2-D pages, including charge coupled devices (CCD),
complementary metal-oxide semiconductor (CMOS) detector chips and small liquid-crystal
panels. The wide availability of these devices was made possible by the commercial success of
digital cameras and video projectors. With these components in hand, holographic-storage
researchers have begun to demonstrate the potential of their technology in the laboratory. By using
the volume of the media, researchers have experimentally demonstrated that data can be stored at
equivalent areal densities of nearly 400 bits/sq. micron. (For comparison, a single layer of a DVD
disc stores data at ~4.7 bits/sq. micron.) A readout rate of 10 gigabits per second has also been
achieved in the laboratory.
FEATURES
Data transfer rate: 1 Gb/s.
The technology permits over 10 kilobits of data to be written and read in parallel with a
single flash.
Most optical storage devices, such as a standard CD, save one bit per pulse.
HVDs manage to store 60,000 bits per pulse in the same place.
1 HVD ≈ 5,800 CDs ≈ 830 DVDs ≈ 160 Blu-ray discs.
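The disc-count comparison can be sanity-checked. The HVD capacity of 3.9 TB used here is an assumption (the figure widely quoted for the proposed Optware format, not stated in this report), and the CD, DVD, and Blu-ray capacities are the standard single-layer values.

```python
MB, GB, TB = 1e6, 1e9, 1e12

hvd_capacity = 3.9 * TB   # assumed HVD capacity (widely quoted target)
media = {"CD": 700 * MB, "DVD": 4.7 * GB, "Blu-ray": 25 * GB}

for name, capacity in media.items():
    print(f"1 HVD ~ {hvd_capacity / capacity:.0f} x {name}")
```

The computed ratios (roughly 5,600 CDs, 830 DVDs, and 156 Blu-ray discs) land close to the rounded figures quoted above.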
HVD STRUCTURE
The HVD structure is shown in Fig. 3.1, which labels the disc's storage-data and recording-data
regions.
A simplified HVD system consists of the following main components:
Blue or green laser (532-nm wavelength in the test system)
Beam splitter/merger
Mirrors
Spatial light modulator (SLM)
CMOS sensor
Polymer recording medium
The process of writing information onto an HVD begins with encoding the information into binary
data to be stored in the SLM. These data are turned into ones and zeroes represented as opaque or
translucent areas on a "page" -- this page is the image that the information beam is going to pass
through.
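That encoding step, bytes in and a grid of opaque or translucent pixels out, can be sketched as follows. The 8 x 8 page size and the function name are illustrative only; real SLM pages are far larger and carry modulation-code and error-correction overhead.

```python
def bytes_to_page(data: bytes, side: int = 8):
    """Pack bytes into a side x side binary page for the SLM.

    1 = translucent pixel, 0 = opaque pixel; the page is zero-padded
    if the data does not fill it.
    """
    bits = []
    for byte in data:
        bits.extend((byte >> i) & 1 for i in range(7, -1, -1))
    if len(bits) > side * side:
        raise ValueError("data does not fit on one page")
    bits += [0] * (side * side - len(bits))
    return [bits[r * side:(r + 1) * side] for r in range(side)]

page = bytes_to_page(b"HVD")
print(page[0])   # first row: the 8 bits of 'H' (0x48)
```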
When the blue-green argon laser is fired, a beam splitter creates two beams. One beam, called the
object or signal beam, will go straight, bounce off one mirror and travel through a spatial-light
modulator (SLM). An SLM is a liquid crystal display (LCD) that shows pages of raw binary data
as clear and dark boxes.
The information from the page of binary code is carried by the signal beam around to the light-
sensitive lithium-niobate crystal. Some systems use a photopolymer in place of the crystal.
A second beam, called the reference beam, shoots out the side of the beam splitter and takes a
separate path to the crystal. When the two beams meet, the interference pattern that is created
stores the data carried by the signal beam in a specific area in the crystal -- the data is stored as a
hologram.
DATA IMAGE
In the HVD read system, the laser projects onto the hologram a light beam that is identical to the
reference beam.
An advantage of a holographic memory system is that an entire page of data can be retrieved
quickly and at one time. In order to retrieve and reconstruct the holographic page of data stored in
the crystal, the reference beam is shined into the crystal at exactly the same angle at which it
entered to store that page of data. Each page of data is stored in a different area of the crystal,
based on the angle at which the reference beam strikes it.
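Angle-based addressing amounts to a lookup table from page number to reference-beam angle, with retrieval failing outside a tight tolerance. A minimal sketch follows; the angles and tolerance are invented for illustration, not taken from any real drive.

```python
# One recording angle per stored page (degrees; invented values)
page_angles = {0: 30.00, 1: 30.05, 2: 30.10}
TOLERANCE_DEG = 0.001   # Bragg selectivity: tiny misalignment kills readout

def read_page(page_id, beam_angle_deg):
    """Reconstruct a page only if the beam matches its recording angle."""
    stored = page_angles[page_id]
    if abs(beam_angle_deg - stored) <= TOLERANCE_DEG:
        return f"page {page_id} reconstructed"
    return None   # angle mismatch: no usable reconstruction

print(read_page(1, 30.05))    # exact match -> reconstruction
print(read_page(1, 30.06))    # slightly off -> None
```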
A critical parameter in any holographic data storage system is the angle at which the reference
beam is fired at the crystal to retrieve a page of data. It must match the original reference-beam
angle exactly; a difference of just a thousandth of a millimeter will result in failure to retrieve that
page of data.
During reconstruction, the beam will be diffracted by the crystal to allow the recreation of the
original page that was stored. This reconstructed page is then projected onto the CMOS, which
interprets and forwards the digital information to a computer.
ADVANTAGES OF HDSS
Automatically searches for an empty space on the disc to avoid recording over a program.
Users will be able to connect to the Internet and instantly download subtitles and other
interactive movie features.
The transfer rate of HVD is up to 1 gigabit (Gb) per second, which is about 40 times faster
than DVD.
An HVD stores and retrieves an entire page of data, approximately 60,000 bits of
information, in one pulse of light, while a DVD stores and retrieves one bit of data in one
pulse of light.
DISADVANTAGES OF HDSS
The manufacturing cost of HDSS is very high, and the resources needed to produce it are not
widely available. In addition, as more holograms are recorded, all of them become dimmer,
because their patterns must share the material's finite dynamic range. In other words, the
additional holograms alter a material that can support only a fixed amount of change. Ultimately,
the images become so dim that noise creeps into the readout operation, thus limiting the material's
storage capacity.
A difficulty with HDSS technology has been destructive readout. The re-illuminated reference
beam used to retrieve the recorded information also excites the donor electrons and disturbs the
equilibrium of the space-charge field in a manner that produces a gradual erasure of the recording.
In the past, this has limited the number of reads that can be made before the signal-to-noise ratio
becomes too low. Moreover, writes can degrade previous writes in the same region of the medium
in the same fashion. This restricts the ability to use the three-dimensional capacity of a
photorefractive material for recording angle-multiplexed holograms. The data cannot be located if
the reference-beam alignment is off by even a thousandth of an inch.
INTERESTING FACTS
It has been estimated that the books in the U.S. Library of Congress, the largest library in the
world, could be stored on six HVDs. The pictures of every landmass on Earth - like the ones
shown in Google Earth - can be stored on two HVDs.
With MPEG-4 ASP encoding, an HVD can hold anywhere between 4,600 and 11,900 hours of
video, enough for non-stop playback for roughly a year.
STANDARDS
On December 9, 2004 at its 88th General Assembly the standards body Ecma International created
Technical committee 44, dedicated to standardizing HVD formats based on Optware’s technology.
On June 11, 2007, TC44 published the first two HVD standards: ECMA-377, defining a 200 GB
HVD “recordable cartridge,” and ECMA-378, defining a 100 GB HVD-ROM disc. Its next stated
goals are 30 GB HVD cards and submission of these standards to the International Organization
for Standardization for ISO approval.
POSSIBLE APPLICATION FIELDS
There are many possible applications of holographic memory. Holographic memory systems can
potentially provide the high-speed transfers and large storage volumes required by future
computer systems. One possible application is data mining.