Primer: Memory

These posts, tagged “Primer,” are posted for two reasons: 1) to help me get better at teaching non-scientists about science-related topics; and 2) to help non-scientists learn more about things they otherwise wouldn’t.  So, while I realize most people won’t read these, I’m going to write them anyway, partially for my own benefit, but mostly for yours.

The whole idea of “memory” has intrigued me for quite a while, arguably since before I was even that interested in science in general.  Part of this is my attraction to all things computers.  I think I built my first computer (rather, helped Dad build one…) back in the late-90s, and at that time, I began to understand all of the components that make it function.  The idea of “input/output,” the function of a central processing unit (CPU), RAM and hard drives…all of these things proved relatively easy to grasp, and in light of these general functions, they made my understanding of the brain a bit easier in the process.

Let’s think of it this way.  You interact with computers in different ways, but one way is with a keyboard.  You type something on the keyboard and the data you input is converted by the CPU into something that can be understood by the system, in this case, binary code (i.e. a series of “1s” and “0s”).  All of your inputs from the keyboard are stored in RAM for faster, short-term access.  If you click the “Save” button on whatever you’re doing, however, the data stored in RAM gets sent to the slower-access hard drive.  As you open programs, information is pulled off the hard drive and into RAM so that your CPU can process it faster, and then you and your keyboard can get at and interact with it.  This is why, in general, having more RAM speeds up your computer: it can pull larger and larger programs into RAM so your CPU can get at them more easily, and thus, you can interact with them faster.
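If you like seeing ideas as code, here’s a tiny sketch of that RAM/hard-drive relationship in Python.  The file names and contents are made up purely for illustration:

```python
# A toy sketch of the RAM/hard-drive relationship described above.
disk = {"essay.txt": "My long essay...", "photo.jpg": "<image data>"}
ram = {}  # fast, small, short-term storage

def open_file(name):
    """Pull a file from the slow disk into fast RAM, then work with it there."""
    if name not in ram:          # not already loaded?
        ram[name] = disk[name]   # one slow read from disk
    return ram[name]             # all later accesses come from fast RAM

def save_file(name, contents):
    """Edits live in RAM; clicking 'Save' writes them back to disk."""
    ram[name] = contents
    disk[name] = contents        # persisted: survives power-off (RAM would not)

open_file("essay.txt")
save_file("essay.txt", "My long essay, now edited.")
print(disk["essay.txt"])  # → My long essay, now edited.
```

Nothing here is how a real operating system manages memory, of course; it just captures the pattern: slow persistent storage, fast temporary storage, and an explicit “save” step that moves data from the second back to the first.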

In very basic terms, your brain works the same way.  We have inputs in the form of our five senses.  The information from those senses gets encoded by your brain’s Cerebral Cortex and is stored temporarily in the Hippocampus (i.e. RAM) before being encoded for long-term storage back in other regions of the Cortex (i.e. hard drive).  Most of the time, your brain “Saves” its data to the Cortex at night, which is why sleeping is so very important.  The “processing” portion of this paradigm can be confusing, but keep in mind that the brain is divided up into specific regions.  There’s a “visual cortex,” an “auditory cortex,” etc.  These regions (within the Cortex…) interpret what each sense gives you and then send that information through the Temporal and Parietal Lobes (also in the Cortex).  From there, the information is spread to the Hippocampus (i.e. RAM) for “integration” before being set as full, long-term memories out in the rest of the brain.

How is that information stored, you may ask?  Again, it’s much like a hard drive.  If you’ve used computers extensively, you know that hard drives are divided up into “sectors” (ever get a disc read error that says “bad sector?”).  When you have a new hard drive, you start with a clean slate.  As you install programs and add files, it gets filled up.  Once you delete something, that sector isn’t really erased; it’s just removed from your access.  The data isn’t truly gone until it’s overwritten by something else (which is why you can sometimes retrieve old files off a hard drive you thought had been deleted).  Whenever you “defragment” your hard drive, you are basically rearranging those programs to keep everything closer together, and thus, quicker to access.  The data on the hard drive is encoded in “1s” and “0s” (i.e. binary code).  Each 1 or 0 is considered to be a “bit,” while a set of eight 1s and 0s (e.g. 11010101, 10011010, etc.) is considered a “byte.”  This is where “kilobytes,” “megabytes” and “gigabytes” come from: roughly a thousand, a million and a billion bytes, respectively.
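To make the bits-and-bytes idea concrete, here’s a small Python snippet showing how a couple of letters can each be stored as one byte, i.e. eight 1s and 0s:

```python
# Each character of text can be stored as one byte: eight 1s and 0s.
text = "Hi"
for ch in text:
    bits = format(ord(ch), "08b")  # the character's number, as 8 binary digits
    print(ch, "→", bits)
# 'H' is character number 72, so it prints:  H → 01001000
# 'i' is character number 105, so it prints: i → 01101001
```

So the two-letter word “Hi” takes two bytes, or sixteen bits, and a thousand-word essay takes a few kilobytes.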

The idea of 1s and 0s comes from logic, specifically the definitions of “True” (i.e. 1) and “False” (i.e. 0).  If you have a “1,” then you have a connection.  If you have a “0,” then you don’t.

Bringing this back to neuroscience, the same general rule appears to apply with regard to memories, or the concept of “learning” in general.  In order to form a memory, it needs to be encoded much like your hard drive is: in a series of combinations of connections (or missed connections) between neurons spanning the entire brain.  There are various molecular mechanisms that can account for these connections, or lack of connections, and those go back to receptor theory.  Remember that neurotransmission involves the release of a neurotransmitter (e.g. dopamine, adrenaline, etc.) from one neuron to bind with a receptor on another.  If a neuron stops receiving signals from another neuron, it will remove its receptors from the outside of the cell, thus limiting or negating the signal.  If, however, a neuron keeps getting increased signaling from an adjacent neuron, it will increase the number of receptors on the outside of the cell, thus making it easier to signal.  Therefore, we have a mechanism for strengthening or weakening the connections between two neurons.

One could consider a “strengthened” neuronal connection to be a “1” and a “weakened” neuronal connection to be a “0.”  It is in this way, it is thought, that memories can be formed on a cell-to-cell basis.
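As a purely illustrative toy model (real synapses are graded and vastly more complicated than this, and the starting receptor count here is made up), you could sketch that strengthening/weakening idea like so:

```python
# A deliberately crude sketch of the idea above: repeated signaling adds
# receptors (strengthening a connection), disuse removes them (weakening it).
# Real synaptic plasticity is far more complex; this is illustration only.

class Synapse:
    def __init__(self):
        self.receptors = 5  # made-up baseline receptor count

    def signal(self):
        self.receptors += 1  # frequent use → more receptors → stronger link

    def neglect(self):
        self.receptors = max(0, self.receptors - 1)  # disuse → fewer receptors

    @property
    def bit(self):
        # The post's analogy: strengthened connection as "1", weakened as "0".
        return 1 if self.receptors > 5 else 0

s = Synapse()
for _ in range(3):
    s.signal()   # three rounds of signaling strengthen the connection
print(s.bit)     # → 1
```

The point isn’t the numbers; it’s that a simple “more use, more receptors” rule is enough to push a connection toward a “1” or a “0.”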

The neurons that memories are stored in are located throughout the brain, much like “sectors” on your hard drive.  As you stop using certain memories, the synapses of those neurons weaken to the point where they can be, effectively, “overwritten” in favor of a new memory.  This is also how the idea of “repressed memories” can come about: you can have a memory stored in a region of your brain that you have forgotten about, but that can re-manifest later.  If it isn’t overwritten, it’s still there.

From a molecular standpoint, scientists have a pretty good idea how memory “works,” but being able to decode those memories is a whole different beast.  Returning to our computer metaphor, imagine knowing nothing about computers and finding a hard drive.  What would you do with it?  Would you take it apart?  How would you know what it was?  Or what it contained?  And once you figured out that it, somehow, contained information, how would you read it?  If you eventually found out that it involved 1s and 0s, how would you know how those 1s and 0s were organized across the hard drive, and then finally, what they told you?

This is why it’s highly unlikely that we’ll ever be able to make or see memories like we do in the movies, at least, not for a very long time.  It’s one thing to understand the basis for how it works, but it’s a whole other thing to try and figure out how it’s organized within a system like the human brain.  Also, it’s been estimated that the human brain holds somewhere between a terabyte and a petabyte of information, which translates to 8,000,000,000,000 to 8,000,000,000,000,000 individual 1s and 0s, or individual neuronal connections.
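For the curious, the arithmetic behind that range is simple: one byte is eight bits, and the storage prefixes stack by factors of a thousand (using round decimal units here):

```python
# The arithmetic behind the range above: one byte is 8 bits, and the
# prefixes stack by factors of 1,000 (decimal units, for round numbers).
BITS_PER_BYTE = 8
terabyte = 1_000_000_000_000   # bytes in one terabyte
petabyte = 1_000 * terabyte    # bytes in one petabyte

print(terabyte * BITS_PER_BYTE)  # → 8000000000000 (8 trillion bits)
print(petabyte * BITS_PER_BYTE)  # → 8000000000000000 (8 quadrillion bits)
```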

Imagine looking at a sheet (or multiple sheets…) of paper with that many 1s and 0s on it and trying to decide which version of Windows it represents.  Or where your dissertation is…not the Word Document, but the PDF.  That’s what we’re talking about.

So yeah, I just find the concept of memory to be fascinating.  With modern computers, we’re effectively reverse-engineering the human brain and, in doing so, learning more and more about how technological and biological computation can work.  But next time you see some “memory reading” device on TV, bear in mind what’s actually required to make that technology work.
