Memory cell (binary)

The memory cell is the fundamental building block of computer memory. It is an electronic circuit that stores one bit of binary information: it is set to store a logic 1 (high voltage level) and reset to store a logic 0 (low voltage level). The value is maintained until it is changed by the set/reset process, and it can be accessed by reading the cell.

Over the history of computing, many different memory cell architectures have been used, including core memory and bubble memory. Today the most common cells are built around flip-flops and capacitors.

The SRAM (static RAM) memory cell is a type of flip-flop circuit, usually implemented using FETs. It requires very little power when not being accessed.

A second type, DRAM (dynamic RAM), is based around a capacitor. Charging and discharging this capacitor stores a '1' or a '0' in the cell. However, the charge on this capacitor slowly leaks away and must be refreshed periodically. Because of this refresh process, DRAM uses more power, but it can achieve greater storage densities.

History

On December 11, 1946, Freddie Williams applied for a patent on his cathode-ray tube (CRT) storing device (Williams tube) with 128 40-bit words. It was operational in 1947 and is considered the first practical implementation of random-access memory.[1] In that year, the first patent applications for magnetic-core memory were filed by Frederick Viehe. An Wang, Ken Olsen and Jay Forrester also contributed to its development. The first modern memory cells were introduced in 1969, when John Schmidt designed the first 64-bit MOS p-channel SRAM. The first bipolar 64-bit SRAM was released by Intel in 1969 with the 3101 Schottky TTL. One year later it released the first DRAM chip, the Intel 1103.[2]

Description

The memory cell is the fundamental building block of memory. It can be implemented using different technologies, such as bipolar, MOS, and other semiconductor devices. It can also be built from magnetic material such as ferrite cores or magnetic bubbles.[3] Regardless of the implementation technology, the purpose of the binary memory cell is always the same: it stores one bit of binary information and must be set to store a 1 and reset to store a 0.[4]

Implementation

The following schematics detail the three most commonly used implementations for memory cells:

DRAM cell (one transistor and one capacitor)
SRAM cell (six transistors)
Clocked J/K flip-flop

Operation

DRAM memory cell

Die of the MT4C1024, integrating one mebibit of DRAM memory cells.
The storage element of the DRAM memory cell is the capacitor labeled (4) in the diagram above. The charge stored in the capacitor degrades over time, so its value must be refreshed (read and rewritten) periodically. The nMOS transistor (3) acts as a gate: when open it allows reading or writing, and when closed it stores the value.
For reading, the word line drives a logic 1 (voltage high) into the gate of the nMOS transistor (3), which makes it conductive, and the charge stored in the capacitor (4) is then transferred to the bit line. The bit line has a parasitic capacitance (5) that drains part of the charge and slows the reading process. The capacitance of the bit line determines the needed size of the storage capacitor (4); it is a trade-off. If the storage capacitor is too small, the voltage on the bit line takes too long to rise, or may not even rise above the threshold needed by the amplifiers at the end of the bit line. Since the reading process degrades the charge in the storage capacitor (4), its value is rewritten after each read.
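To make the trade-off concrete, the following back-of-the-envelope sketch in Python computes the charge sharing during a read: the bit line, precharged to half the supply voltage, shares charge with the storage capacitor, and the resulting small swing is what the sense amplifiers must detect. The component values (25 fF cell, 250 fF bit line, 1.8 V supply) are purely hypothetical and chosen only for illustration.

```python
# Minimal charge-sharing sketch for a DRAM read (illustrative values only).
# The bit line is precharged to Vdd/2; when the access transistor opens,
# the storage capacitor and the bit-line parasitic capacitance share charge.

def bitline_swing(v_cell, c_storage, c_bitline, vdd=1.8):
    """Return the bit-line voltage deviation from Vdd/2 after charge sharing."""
    v_precharge = vdd / 2
    # Conservation of charge: Cs*Vcell + Cbl*Vpre = (Cs + Cbl)*Vfinal
    v_final = (c_storage * v_cell + c_bitline * v_precharge) / (c_storage + c_bitline)
    return v_final - v_precharge

# Hypothetical values: 25 fF storage capacitor, 250 fF bit-line capacitance.
swing_1 = bitline_swing(v_cell=1.8, c_storage=25e-15, c_bitline=250e-15)
swing_0 = bitline_swing(v_cell=0.0, c_storage=25e-15, c_bitline=250e-15)
print(f"read '1': {swing_1 * 1000:+.0f} mV, read '0': {swing_0 * 1000:+.0f} mV")
```

With a ten-to-one capacitance ratio, the usable swing is only on the order of tens of millivolts, which is why the storage capacitor cannot be made arbitrarily small relative to the bit line.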
The writing process is simpler: the desired value, logic 1 (high voltage) or logic 0 (low voltage), is driven into the bit line. The word line then activates the nMOS transistor (3), connecting the bit line to the storage capacitor (4). The only requirement is to keep the transistor on long enough to ensure that the capacitor is fully charged or discharged before it is turned off.
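Putting the read, write, and refresh steps together, here is a minimal behavioral sketch of a single DRAM cell in Python. It is a logic-level caricature under assumed, hypothetical retention and threshold values, not a circuit simulation; the class name and constants are invented for the example.

```python
import time

class DRAMCellModel:
    """Toy behavioral model of a single DRAM cell (not a circuit simulation)."""

    RETENTION = 0.064   # hypothetical retention time in seconds
    THRESHOLD = 0.5     # sense threshold as a fraction of full charge

    def __init__(self):
        self.charge = 0.0
        self.last_write = time.monotonic()

    def write(self, bit):
        # Word line opens the access transistor; the bit line charges or
        # discharges the storage capacitor.
        self.charge = 1.0 if bit else 0.0
        self.last_write = time.monotonic()

    def _decayed_charge(self):
        # Simple linear leak model: the charge drains away over the retention time.
        elapsed = time.monotonic() - self.last_write
        leak = min(elapsed / self.RETENTION, 1.0)
        return max(self.charge - leak, 0.0)

    def read(self):
        # Reading degrades the stored charge, so the sensed bit is
        # immediately rewritten (the read is destructive).
        bit = 1 if self._decayed_charge() > self.THRESHOLD else 0
        self.write(bit)
        return bit

    def refresh(self):
        # A periodic refresh is just a read followed by the implicit rewrite.
        self.read()

cell = DRAMCellModel()
cell.write(1)
print(cell.read())   # 1 if read within the retention window
```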

SRAM memory cell

SRAM memory cell, depicting the inverter loop as gates
The working principle of the SRAM memory cell is easier to understand if transistors M1 through M4 are drawn as logic gates. That way it is clear that, at its heart, the cell storage is built from two cross-coupled inverters. This simple loop creates a bistable circuit. A logic 1 at the input of the first inverter becomes a 0 at its output, which is fed into the second inverter; the second inverter transforms that logic 0 back into a logic 1, feeding the same value back to the input of the first inverter. That creates a stable state that does not change over time. Similarly, the other stable state of the circuit has a logic 0 at the input of the first inverter; after being inverted twice, it also feeds back the same value.
Therefore there are only two stable states that the circuit can be in:
  • Q = 0 and Q̄ = 1
  • Q = 1 and Q̄ = 0
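As a plain illustration of why the cross-coupled loop is bistable, the following Python sketch (an illustrative model, not a circuit simulation) iterates two ideal inverters feeding each other; both consistent assignments of Q and Q̄ are fixed points.

```python
def inverter(x):
    """Ideal logic inverter."""
    return 0 if x else 1

def settle(q, q_bar, steps=4):
    """Iterate the two cross-coupled inverters; a stable state is a fixed point."""
    for _ in range(steps):
        q, q_bar = inverter(q_bar), inverter(q)
    return q, q_bar

print(settle(1, 0))  # (1, 0): stable, the loop keeps reinforcing the stored 1
print(settle(0, 1))  # (0, 1): stable, the other stored value
```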
To read the contents of the memory cell stored in the loop, the transistors M5 and M6 must be turned on. When they receive voltage at their gates from the word line (WL), they become conductive, and the Q and Q̄ values are transmitted to the bit line (BL) and to its complement (BL̄). Finally, these values are amplified at the end of the bit lines.
The writing process is similar; the difference is that the new value to be stored in the memory cell is driven into the bit line (BL) and its complement (BL̄). Next, transistors M5 and M6 are opened by driving a logic 1 (voltage high) into the word line (WL), connecting the bit lines to the loop. There are two possible cases, illustrated by the behavioral sketch after this list:
  1. If the value of the loop is the same as the new value driven, there is no change.
  2. If the value of the loop is different from the new value driven, there are two conflicting values. For the voltage on the bit lines to overwrite the output of the inverters, the M5 and M6 transistors must be larger than the M1-M4 transistors, so that more current flows through them and tips the voltage toward the new value; the loop then amplifies it to the full rail.
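A behavioral way to picture these read and write rules is the logic-level sketch below. It is an invented model, not a transistor-level simulation: the word line connects the loop to the bit lines, a write with a differing value overwrites the loop (standing in for the M5/M6 sizing rule in case 2), and a read simply exposes Q and Q̄ on the bit lines.

```python
class SRAMCellModel:
    """Toy behavioral model of a 6T SRAM cell (logic-level, not electrical)."""

    def __init__(self, q=0):
        self.q = q           # output of the second inverter
        self.q_bar = 1 - q   # output of the first inverter

    def access(self, word_line, bl=None, bl_bar=None):
        """Word line high connects the loop to BL/BL_bar through M5 and M6."""
        if not word_line:
            return None, None            # cell isolated, nothing happens
        if bl is not None and bl_bar is not None and bl != self.q:
            # Write: the (larger) access transistors overpower the inverters
            # and the loop amplifies the new value to full rail.
            self.q, self.q_bar = bl, bl_bar
        return self.q, self.q_bar        # read: Q and Q_bar appear on the bit lines

cell = SRAMCellModel(q=0)
print(cell.access(word_line=1))                  # read: (0, 1)
print(cell.access(word_line=1, bl=1, bl_bar=0))  # write 1: loop flips to (1, 0)
print(cell.access(word_line=0, bl=0, bl_bar=1))  # word line low: cell isolated
```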

Flip-flop

The flip-flop has many different implementations. Its storage element is usually a latch consisting of a NAND gate loop or a NOR gate loop, with additional gates used to implement clocking. Its value is always available for reading as an output, and it remains stored until it is changed through the set or reset process.
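As an illustration of such a gate loop, the following Python sketch models a cross-coupled NOR SR latch at the logic level, ignoring timing, clocking, and metastability; the function names and the settling loop are invented for the example.

```python
def nor(a, b):
    """Ideal two-input NOR gate."""
    return 0 if (a or b) else 1

def sr_latch(s, r, q, q_bar):
    """One settling pass of a cross-coupled NOR SR latch.

    s=1 sets Q to 1, r=1 resets Q to 0, s=r=0 holds the stored value.
    """
    for _ in range(2):                    # iterate until the loop settles
        q = nor(r, q_bar)
        q_bar = nor(s, q)
    return q, q_bar

q, q_bar = sr_latch(s=1, r=0, q=0, q_bar=1)       # set   -> (1, 0)
q, q_bar = sr_latch(s=0, r=0, q=q, q_bar=q_bar)   # hold  -> still (1, 0)
q, q_bar = sr_latch(s=0, r=1, q=q, q_bar=q_bar)   # reset -> (0, 1)
print(q, q_bar)
```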

Applications

Square array of DRAM memory cells being read

Logic circuits without memory cells or feedback paths are called combinational; their output values depend only on the current values of their inputs, and they have no memory. But memory is a key element of digital systems. In computers, it allows both programs and data to be stored, and memory cells are also used for temporary storage of the output of combinational circuits, to be used later by digital systems. Logic circuits that use memory cells are called sequential circuits. Their output depends not only on the present value of their inputs, but also on the circuits' previous state, as determined by the values stored in their memory cells. These circuits require a timing generator or clock for their operation.[5]
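The distinction can be restated in code: a combinational block is a pure function of its current inputs, while a sequential block also depends on state held in memory cells and advances on a clock. The 1-bit accumulator below is a made-up Python illustration of that idea, not an example taken from the source.

```python
# Combinational: the output depends only on the current inputs (a pure function).
def full_adder(a, b, carry_in):
    total = a + b + carry_in
    return total % 2, total // 2           # (sum, carry_out)

# Sequential: the output also depends on state stored in a memory cell,
# updated once per clock tick.
class OneBitAccumulator:
    def __init__(self):
        self.stored = 0                     # the "memory cell"
        self.carry = 0

    def clock_tick(self, a):
        # The new state is a combinational function of the input and the old state.
        self.stored, self.carry = full_adder(a, self.stored, 0)
        return self.stored

acc = OneBitAccumulator()
print([acc.clock_tick(bit) for bit in (1, 0, 1, 1)])   # [1, 1, 0, 1]
```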

The memory used in computer systems is built mainly from DRAM cells; since the layout is much smaller than that of SRAM, it can be more densely packed, yielding cheaper memory with greater capacity. Since the DRAM memory cell stores its value as the charge of a capacitor, and there are current leakage issues, its value must be constantly rewritten. This is one of the reasons that make DRAM cells slower than the larger SRAM (static RAM) cells, whose value is always available. That is why SRAM is used for the on-chip cache included in modern microprocessor chips.[6]

References

  1. O’Regan, Gerard (2013). Giants of Computing: A Compendium of Select, Pivotal Pioneers. Springer Science & Business Media. p. 267. ISBN 1447153405. Retrieved 13 December 2015.
  2. Pugh, Emerson W.; Johnson, Lyle R.; Palmer, John H. (1991). IBM's 360 and Early 370 Systems. MIT Press. p. 706. ISBN 0262161230. Retrieved 9 December 2015.
  3. Tang, Denny D.; Lee, Yuan-Jen (2010). Magnetic Memory: Fundamentals and Technology. Cambridge University Press. p. 91. ISBN 1139484494. Retrieved 13 December 2015.
  4. Fletcher, William (1980). An engineering approach to digital design. Prentice-Hall. p. 283. ISBN 0-13-277699-5.
  5. Microelectronic Circuits (Second ed.). Holt, Rinehart and Winston, Inc. 1987. p. 883. ISBN 0-03-007328-6.
  6. "La Question Technique : le cache, comment ça marche ?". PC World Fr.