Introduction
Most memory
devices store and retrieve data by addressing specific memory locations. As a
result, this access path often becomes the limiting factor for systems that rely on
fast memory accesses. The time required to find an item stored in memory can be
reduced if the item can be identified for access by its content rather than by
its address. A memory that is accessed
in this way is called content-addressable memory, or CAM. CAM provides a performance advantage over
other memory search algorithms, such as binary or tree-based searches, by
comparing the desired information against the entire list of pre-stored entries
simultaneously.
Basics of CAM
Since CAM is
an outgrowth of Random Access Memory (RAM) technology, it helps to contrast
it with RAM in order to understand it. A
RAM is an integrated circuit that stores data temporarily. Data is stored in a RAM at a particular
location, called an address. In a RAM,
the user supplies the address and gets back the data. The number of address
lines limits the depth of a memory using RAM, but the width of the memory can be
extended as far as desired. With CAM,
the user supplies the data and gets back the address. The CAM searches through
the memory in one clock cycle and returns the address where the data is
found. The CAM can be preloaded at
device startup and can also be rewritten during device operation. Because the CAM does not need address lines
to find data, the depth of a memory system using CAM can be extended as far as
desired, but the width is limited by the physical size of the memory.
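To make the contrast concrete, the following Python sketch models the two access styles purely at the behavioral level; the function names ram_read and cam_search are illustrative, and a real CAM compares all entries in parallel in a single clock cycle rather than looping.

    # Behavioral contrast between RAM and CAM (software model, not hardware).
    def ram_read(memory, address):
        """RAM: the user supplies an address and gets back the data."""
        return memory[address]

    def cam_search(memory, data):
        """CAM: the user supplies data and gets back the matching address(es).
        The loop only models the result of the parallel comparison."""
        return [addr for addr, word in enumerate(memory) if word == data]

    memory = ["1010", "0111", "1100", "0111"]
    print(ram_read(memory, 2))         # -> "1100"
    print(cam_search(memory, "0111"))  # -> [1, 3]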
CAM Architecture:
A. NOR Cell
The NOR cell
implements the comparison between the complementary stored bit, D (and its
complement), and the complementary search data on the complementary searchlines,
SL (and its complement), using
four comparison transistors, which are all typically minimum-size to
maintain high cell density. These transistors implement the pulldown path of a
dynamic XNOR logic gate with inputs SL and D. Each pair of transistors forms
a pulldown path from the matchline, ML, such that a mismatch of SL and D
activates at least one of the pulldown paths, connecting ML to ground. A match of SL
and D disables both pulldown paths, disconnecting ML from ground.
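A minimal behavioral sketch of the NOR cell's bit comparison follows (Python, full-rail 0/1 values only; transistor-level timing and charge sharing are not modeled, and the function name is illustrative): a mismatch between the search bit and the stored bit enables one of the two pulldown paths, while a match leaves the matchline disconnected from ground.

    def nor_cell_pulldown(search_bit, stored_bit):
        """Return True if the NOR cell pulls the matchline low (bit mismatch).
        One pulldown path is gated by (SL, D complement), the other by
        (SL complement, D); either path conducting corresponds to SL != D."""
        sl, sl_bar = search_bit, 1 - search_bit
        d, d_bar = stored_bit, 1 - stored_bit
        path1 = sl and d_bar          # conducts when SL = 1 and D = 0
        path2 = sl_bar and d          # conducts when SL = 0 and D = 1
        return bool(path1 or path2)   # True -> ML discharged at this bit

    assert nor_cell_pulldown(1, 1) is False  # match: ML stays high
    assert nor_cell_pulldown(1, 0) is True   # mismatch: pulldown active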
Figure: NAND and NOR cells
B. NAND Cell
The NAND cell
implements the comparison between the stored bit, D, and the corresponding search
data on the corresponding searchlines, SL (and its complement), using three comparison
transistors, which are all typically minimum-size to maintain high
cell density. We illustrate the bit-comparison operation of a NAND cell through
an example. Consider the case of a match when both the search bit and the stored
bit are logic “1”. The pass transistor gated by the stored bit is ON and
passes the logic “1” on the SL to node B. Node B is the bit-match node, which is
logic “1” if there is a match in the cell. The logic “1” on node B turns ON
the cell's series transistor in the matchline. Note that this transistor is also
turned ON in the other match case, when both the search bit and the stored bit are
logic “0”; in this case, the complementary pass transistor passes a logic high to
raise node B. The remaining cases, where the search bit and the stored bit differ,
result in a miss condition; accordingly, node B is logic “0”
and the series transistor is OFF.
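A corresponding sketch for the NAND cell (again a behavioral Python model with an illustrative name): node B goes high only when the search bit equals the stored bit, which in turn enables the cell's series transistor in the matchline chain.

    def nand_cell_bit_match(search_bit, stored_bit):
        """Model node B of the NAND cell: high (True) only on a bit match.
        The pass transistor gated by D passes SL to node B when D = 1,
        and the pass transistor gated by D's complement passes SL's
        complement when D = 0, so B = 1 exactly when SL == D."""
        sl, sl_bar = search_bit, 1 - search_bit
        if stored_bit == 1:
            node_b = sl        # D's pass transistor passes SL
        else:
            node_b = sl_bar    # complementary pass transistor passes SL's complement
        return bool(node_b)    # True -> series transistor in the ML chain is ON

    assert nand_cell_bit_match(1, 1) and nand_cell_bit_match(0, 0)
    assert not nand_cell_bit_match(1, 0) and not nand_cell_bit_match(0, 1)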
NOR Matchline:
The figure below shows, in schematic
form, how NOR cells are connected in parallel to form a NOR matchline, ML.
While we show binary cells in the figure, the description of matchline operation
applies to both binary and ternary CAM. A typical NOR search cycle operates in
three phases: searchline precharge, matchline precharge, and matchline
evaluation. First, the searchlines are precharged low to disconnect the
matchlines from ground by disabling the pulldown paths in each CAM cell.
Second, with the pulldown paths disconnected, a precharge transistor charges the
matchlines high. Finally, the searchlines are driven to the search-word values,
triggering the matchline evaluation phase. In the case of a match, the ML
voltage stays high, as there is no discharge path to ground. In the case of a
miss, there is at least one path to ground that discharges the matchline. The
matchline sense amplifier (MLSA) senses the voltage on ML and generates a
corresponding full-rail output match result. We will see several variations of
this scheme for evaluating the state of NOR matchlines in a later section. The main
feature of the NOR matchline is its high speed of operation.
Figure: NOR matchline
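The search cycle described above can be modeled behaviorally as follows (Python sketch; precharge and sensing are collapsed into a single boolean per matchline, and the helper name is illustrative): the matchline stays high, signaling a match, only if no cell in the row activates a pulldown path.

    def nor_matchline_match(search_word, stored_word):
        """NOR matchline: precharged high, discharged by any mismatching cell.
        The ML stays high (match) only if no cell's pulldown path conducts,
        i.e. only if every search bit equals the corresponding stored bit."""
        any_pulldown = any(s != d for s, d in zip(search_word, stored_word))
        return not any_pulldown   # True -> ML remains high -> match

    assert nor_matchline_match([1, 0, 1, 1], [1, 0, 1, 1]) is True   # match
    assert nor_matchline_match([1, 0, 1, 1], [1, 1, 1, 1]) is False  # miss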
NAND Matchline:
A number of cells are cascaded to form the
matchline (this is, in fact, a match node, but for consistency we will refer
to it as ML). For the purpose of explanation, we use the binary version of the
NAND cell, but the same description applies to the case of a ternary cell. On
the right of the figure, the precharge pMOS transistor sets the initial voltage of
the matchline, ML, to the supply voltage. Next, the evaluation nMOS
transistor turns ON. In the case of a match, all the nMOS transistors in the series
chain are ON, effectively creating a path to ground from the ML node and hence
discharging ML to ground. In the case of a miss, at least one of the series nMOS
transistors is OFF, leaving the ML voltage high. A sense amplifier, the MLSA,
detects the difference between the match (low) voltage and the miss (high)
voltage. The NAND matchline has an explicit evaluation transistor, in contrast to
the NOR matchline.
Figure: NAND matchline
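A matching sketch for the NAND matchline (behavioral Python; the series stack is reduced to an all-bits-match test, names illustrative) highlights the inverted sense relative to the NOR matchline: here the matchline is discharged on a match and remains high on a miss.

    def nand_matchline_discharged(search_word, stored_word):
        """NAND matchline: ML discharges only if the whole series chain conducts.
        Each cell's series transistor is ON only on a bit match, so the path
        from ML to ground exists (ML goes low -> match) only when all bits
        match; a single mismatching cell breaks the chain and ML stays high."""
        chain_conducts = all(s == d for s, d in zip(search_word, stored_word))
        return chain_conducts      # True -> ML pulled low -> match

    assert nand_matchline_discharged([1, 0, 1], [1, 0, 1]) is True   # match: ML low
    assert nand_matchline_discharged([1, 0, 1], [1, 1, 1]) is False  # miss: ML high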
Types of CAM:
There are usually two basic types of CAM:
1) Binary CAM (BiCAM)
2) Ternary CAM (TCAM)
Binary CAM: Binary
CAM is the simplest type of CAM,
which uses data search words consisting entirely of 1s and 0s.
Ternary
CAM: Ternary
CAM can store ‘0’, ‘1’, or ‘X’. Ternary
CAM (TCAM) allows a third
matching state of "X" or "don't care" for one or more bits
in the stored dataword, thus adding flexibility to the search. For example, a
ternary CAM might have a stored word of "10XX0", which will match any
of the four search words "10000", "10010",
"10100", or "10110". The added search flexibility comes at
an additional cost over binary CAM, as the internal memory cell must now encode
three possible states instead of the two of binary CAM. This additional state
is typically implemented by adding a mask bit ("care" or "don't
care" bit) to every memory cell.
..................to be continued